TRANSCRIPT
IT Productivity Measurement Program
Speakers

David Herron, VP Solution Services, David Consulting Group
Philippe Guerin, Director, Field Engineering, CAST
Introduction

The Business Value of IT
Making Informed Decisions
Establishing an Information Framework
Creating and Using Meaningful Measures
The Importance of Consistency
Measuring Performance
Business Impact of IT

[Diagram: how IT drives business value. On the IT side, strategic positioning (business and technical) and continuous process improvement lead to improved productivity, reduced time to market, increased quality, and lower costs. On the business side, these deliver value: satisfied customers, an improved competitive position, increased market share, improved margins, increased revenues, and ultimately shareholder value.]
Characteristics: Reliable Decision-Making Data

Issue: Information arrives too late to be useful.
Addressed by: Data collection, processing, and reporting must be streamlined.
Data must be: Timely

Issue: Reports are too detailed, or "down in the weeds."
Addressed by: Appropriate summarization and analysis must be provided based on users' needs and roles.
Data must be: Relevant

Issue: Information system provides partial data.
Addressed by: Data completeness must be based on reporting needs.
Data must be: Complete

Issue: Data is not considered "trustworthy."
Addressed by: Establish accuracy checks and verifications.
Data must be: Reliable

Issue: Reports reveal problem trends but don't provide detail.
Addressed by: Reports must provide underlying information, not just summaries.
Data must: Allow "drilling down"
Governance – Making the Right Decisions
What decisions need to be made?
Who makes those decisions?
What information is available?
How effectively do we make decisions?
What’s the impact of poorly made decisions?
Overall Information Framework

[Diagram: an enterprise-level information framework. Projects X, Y, and Z execute a managed process with a Define, Measure, Execute, Control, Improve loop. Process measures flow into a process measurement repository (PAL) and an enterprise database of historical measures, which feed performance measures, baseline project estimates, and business decisions at the enterprise, process, and project levels. End users consume an executive management dashboard, illustrated by the panels below.]
[Chart: Project Defect Status. # of defects by month, Jan '08 to Oct '08; series: Total Defects Discovered, Total Closed Defects.]
[Chart: Requirements Growth and Stability. # of requirements by month, Jan '08 to Dec '08; series: Added, Changed, Deleted, Total Reqs.]
[Chart: Project Resource and Effort Status. Project resources/hours by month, Jan '08 to Dec '08; series: Cum Planned Effort Allocated, Cum Actual Effort Spent, "Earned Value", Baseline Total Hours.]
Milestone status:

Milestone | Baseline | Plan | Actual | % Var
Checkpoint A – Charter & Kickoff | 1/10/2008 | 1/10/2008 | 1/10/2008 | 0%
Requirements Complete | 1/28/2008 | 1/28/2008 | 1/28/2008 | 0%
Vendor Selection Complete | 2/4/2008 | 2/4/2008 | 2/15/2008 | 7%
PMP/Schedule Complete | 2/12/2008 | 2/12/2008 | 2/28/2008 | 11%
Checkpoint B – Planning & Reqs | 2/28/2008 | n/a | 3/15/2008 | 11%
Design Complete | 3/15/2008 | n/a | 4/15/2008 | 20%
Development Complete | 4/15/2008 | n/a | 4/30/2008 | 10%
Checkpoint C – Midpoint | 4/30/2008 | n/a | 5/15/2008 | 10%
Testing Complete | 4/30/2008 | n/a | 5/15/2008 | 10%
Training Complete | 5/10/2008 | n/a | 5/30/2008 | 13%
Go Live | 5/30/2008 | n/a | 6/15/2008 | 11%
Lessons Learned/Cust Sat Survey Complete | 6/1/2008 | n/a | 6/30/2008 | 19%
Checkpoint D – Deploy & Close | 6/1/2008 | n/a | 6/30/2008 | 19%
Multi-Level Measurement

An information-driven measurement process spans three management levels, supported by a risk management process:

Enterprise Management: performance measurement; normative performance baselines; technical and business policy; investment decisions and analysis.
Organizational Management: process improvement; project planning guidelines; performance-based guidelines; organizational norms and benchmarks.
Project Management: project estimation and planning; project performance tracking; project tradeoff analysis; resource management.
Measurement Palette

Goals: Deliver Value; Deliver Efficiently; Deliver Effectively; Support a Balanced Infrastructure.

Candidate measures: Product Quality; Change Rate (AM); Application Retirement; Program Support Cost; Project Backlog Age; Process Maturity; Productivity (AD); Assignment Scope (AM); Throughput; Satisfaction; ROI; Measurement Issue Escalation Rate.

Application Development primary size unit of measure:
•IFPUG Function Points
•QEFP Variant

Application Maintenance primary size unit of measure:
•Function Points
•CAST Points
•Calibrated Tickets
POLL QUESTION
Make sure you submit your response!
Metrics for Productivity Measurement

Quality metrics:
•Technical Quality Index
•Health Factors
•Compliance with Best Practices
•Critical Violations
•New Critical Violations…

Quantity metrics:
•Function Points
•Enhancement Function Points
•Artifacts
•Artifacts Processed
•Lines of Code…

Complexity metrics:
•Effort Complexity
•SQL Complexity
•Algorithm Complexity
•Coupling…

The aim is to improve the productivity of ADM: reuse of components and usage of frameworks, early detection of defects to avoid back and forth between Dev and QA or Dev and Prod, and defect reduction to reduce maintenance cost. Environmental factors are measured as well. Several metrics are used to measure output (tracking technical and functional modifications), normalized for language expressiveness, together with effort complexity measurement and compliance with best practices.
Measure Productivity via CAST AIP

The output of developers is measured in function points processed, artifacts processed, lines of code, etc. The effort is what went into building or modifying the application.

Raw Productivity = Output / Effort
  where Output = f(quantity: LOC, or EFP, or artifacts delta)
  and Effort = cost of the release at dev, QA, or through prod

Quality-Adjusted Productivity = Raw Productivity x ΔTQI*

*TQI: Technical Quality Index

Sources: quality, quantity, and effort complexity from CAST AIP; defects from defect tracking tools; project duration from a proprietary tool.

The output needs to be normalized for language expressiveness, domain complexity, and quality. The effort should be normalized for the portion of the life cycle and tasks included.
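To make the arithmetic concrete, here is a minimal sketch of the two formulas above. It assumes output counted in enhancement function points, effort in hours, and ΔTQI read as the ratio of the release's TQI to the baseline's (one plausible interpretation; the slide does not define it). All names are illustrative, not CAST AIP's API.

```python
from dataclasses import dataclass

@dataclass
class Release:
    # Hypothetical inputs: in practice quantity and TQI come from CAST AIP
    # and effort from time tracking; these field names are illustrative.
    enhancement_fp: float  # output quantity, in enhancement function points
    effort_hours: float    # cost of the release (dev through prod), in hours
    tqi: float             # Technical Quality Index of this release

def raw_productivity(r: Release) -> float:
    """Raw Productivity = Output / Effort (here, EFP per hour)."""
    return r.enhancement_fp / r.effort_hours

def quality_adjusted_productivity(r: Release, baseline: Release) -> float:
    """Scale raw productivity by the change in TQI against the prior
    release, so output delivered at the expense of quality is discounted."""
    delta_tqi = r.tqi / baseline.tqi  # assumed reading of "delta TQI"
    return raw_productivity(r) * delta_tqi

prev = Release(enhancement_fp=120, effort_hours=1900, tqi=3.1)
curr = Release(enhancement_fp=140, effort_hours=2000, tqi=2.9)
print(f"raw: {raw_productivity(curr):.4f} EFP/hour")
print(f"quality-adjusted: {quality_adjusted_productivity(curr, prev):.4f}")
```

The quality adjustment is what keeps the measure honest: a team that ships more function points while letting TQI slide sees its adjusted productivity fall, not rise.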
Why Use CAST AIP?

• Automated: automation ensures consistency, reliability, and objectivity.
• Cost-effective: once set up, unlimited estimations can be done as needed at no additional cost.
• Ensures consistent measures: a manual count can be used to calibrate an application analysis, and this calibration is automated so that the benefit of the adjustment accrues over time.
• No documentation needed: previous versions of the application can be used as a baseline, based on the source code (a typical Fortune 500 company can have as much as 30% of its portfolio undocumented or poorly documented).
• Analyzes entire applications: CAST will analyze an application of any size and provide a consistent FP estimation.
• Improves productivity: CAST output provides value through additional information on cost, quality, quantity, and complexity, plus automated technical documentation.
Quality Measurement

Measure the overall quality of the application and the overall risk of the last release.

High-level quality indicators: assess the risk associated with the business concern (technical debt, TQI, health factors).

Intermediate quality indicators: assess technological areas and aspects of development.

Raw measures: performed directly on the source code; measure compliance with development rules, architecture compliance rules, and cross-technology compliance rules.

Immediate action: performed directly on the source code; measure compliance with critical rules, compliance rules on highly used components, and compliance rules by transaction.
Complexity Measurement

Measure | Thresholds | Based on
Algorithmic complexity | simple / medium / complex / very complex | Cyclomatic complexity (count of program and control decision statements)
SQL complexity | simple / medium / complex / very complex | Raw SQL complexity (based on # of tables, # of subqueries, # of FROM clauses, and GROUP BYs per query)
Coupling (fan-in, fan-out) | simple / medium / complex / very complex | Number of links per component, from or to the component measured
Ratio of documentation | simple / medium / lack of comments / not documented | (# of lines of comments - # of bad comments) / # of lines of code
Size of components, files | small / medium / large / very large | # of lines of code

Effort and Cost Estimation: the complexity, run through the workload table, provides the effort of a specific release.
Propagated Risk Index: the complexity, run through the quality model, provides the propagated risk index.
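A small sketch of how the threshold binning in the table above might work. The cut points are invented for illustration and are not CAST AIP's shipped defaults; the documentation ratio follows the formula in the table.

```python
# Illustrative cut points only; CAST AIP's actual thresholds may differ.
CUTS = {
    "algorithmic": (5, 10, 20),   # cyclomatic complexity boundaries (assumed)
    "coupling": (10, 25, 50),     # fan-in + fan-out boundaries (assumed)
}
LABELS = ("simple", "medium", "complex", "very complex")

def classify(value: float, cuts) -> str:
    """Bin a raw measure into one of the four threshold categories."""
    low, mid, high = cuts
    if value <= low:
        return LABELS[0]
    if value <= mid:
        return LABELS[1]
    if value <= high:
        return LABELS[2]
    return LABELS[3]

def documentation_ratio(comment_lines: int, bad_comments: int,
                        code_lines: int) -> float:
    """(# lines of comments - # bad comments) / # lines of code, per the table."""
    return (comment_lines - bad_comments) / code_lines

print(classify(14, CUTS["algorithmic"]))            # -> "complex"
print(round(documentation_ratio(120, 15, 900), 3))  # -> 0.117
```

Binning into a handful of categories rather than reporting raw counts is what lets these measures roll up consistently across technologies.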
Quantity Measurement

Finding: nine out of ten programs/projects that fail have not been properly sized.

Consider: when you build a house, you specify all the functions and features you want; these are your requirements. The builder then generates an estimated budget based on the size (square footage) of your requirements.

Size is the key to effectively managing software deliverables.
Sizing Options

Internal / external definitions: organization-specific definitions (modules, use cases, test cases; story points) versus industry-defined measures (lines of code; use case points; function points).

Consistency / accuracy: from less accurate (hours/days, story points, lines of code) to more accurate (use case points, function points).

Power / ease of use: hours/days and story points have fewer rules and are easier to learn; use case points and function points have more rules and are harder to learn. On the power/ease index, power increases as ease of use decreases.
Why Function Points?

Function Point Analysis is a standardized method for measuring the functionality delivered to an end user. It is:
• A consistent method
• Easy to learn
• Available early in the lifecycle
• An acceptable level of accuracy
• Meaningful internally and externally

Function point counts have replaced lines-of-code counts as a sizing metric that can be used consistently and with a high degree of accuracy.

CAST developed supplemental parsing logic to count IFPUG function points through the code analyzer modules, then partnered with DCG to improve the calibration of this logic. The counting rules that express the IFPUG standard for counting function points provide a simulation of a manual count, with repeatability.
Quantity Measurement: Function Points

Measure the number of transactions managed by the application to gauge the amount of functionality delivered.
Measure the number of modifications (added, updated, deleted) between two measures.
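As a toy illustration of counting modifications between two measures, the sketch below diffs two snapshots of an application's transactions. The fingerprint-dict data model is hypothetical, not CAST AIP's; it only shows the added/updated/deleted bookkeeping.

```python
# Each dict maps a transaction name to a fingerprint (e.g., a hash of its
# implementation). Comparing two snapshots yields the modification counts.

def modification_counts(previous: dict, current: dict) -> tuple:
    added = sum(1 for name in current if name not in previous)
    deleted = sum(1 for name in previous if name not in current)
    updated = sum(1 for name, fp in current.items()
                  if name in previous and previous[name] != fp)
    return added, updated, deleted

prev = {"create_order": "a1", "cancel_order": "b2", "list_orders": "c3"}
curr = {"create_order": "a1", "cancel_order": "b9", "get_invoice": "d4"}
print(modification_counts(prev, curr))  # (1, 1, 1): added, updated, deleted
```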
Calibration Process

Boundary: the application boundary is sometimes not exactly the same as the input. For example, when some third-party tools or databases are used externally and not internally, defining the boundary properly will help.

Completeness: batch files, especially in distributed applications, are sometimes missed. A review of the input (the source code of the application) can help with the final count. A missing layer or some over-delivery can have a huge impact on the final estimation.

Transaction functions: by default, transaction functions are flagged automatically based on the type of objects. In some cases, mainly due to the nature of the application, some additional entries need to be defined in order to get a more accurate estimation.

Data functions: by default, data functions are flagged automatically based on the type of objects; in some cases it makes sense to flag APIs of third-party tools as data entities, or to flag some specific outputs of the application.
Outputs: Application Function Points
Compiling Baseline Data
Effort complexity from CAST AIP
Quality from CAST AIP
Quantity from CAST AIP
Defects from HP Quality Center
Project duration from a proprietary tool
A Measurement Baseline Model

[Diagram: measured performance is driven by process, methods, skills, tools, and management. Qualitative side: capability maturity identifies what you are doing and sets a standard of performance, yielding best practices. Quantitative side: a baseline of performance (size, effort, duration, cost, quality) measures how you are doing, surfacing opportunities for improvement.]
Benchmark Industry Standards

There are numerous sources of industry benchmark data, e.g., Jones, ISBSG, DCG. The majority of quantitative benchmark data is function point based. Common data points include productivity, quality, time to market, and cost.

Industry data is typically segmented by:
Industry Type
Project Type
Platform
Database Type
Methodology
Language
POLL QUESTION
Make sure you submit your response!
Dashboard / Service Levels

Measure Name | Calculation | Notes | Industry Median (primarily Level 3 organizations) | Goal by 2012
Estimating Accuracy - Effort | (actual labor hours - estimated) / estimated | Positive values represent overruns; negative, underruns. | 0% | 18%
Estimating Accuracy - Schedule | (actual calendar months - estimated) / estimated | Positive values represent overruns; negative, underruns. | 0% | 18%
Productivity | function points / labor months | Varies with project size. | 26 | 20
Unit Cost | dollars / function points | Dollars are estimated from labor hours at $110 per hour and 145 hours per staff month. | $613 | $800
System Delivery Rate | function points / calendar months | Value is a mean; median not available. | 49 | 40
Requirements Volatility | (added + changed + deleted) / total baseline requirements | For all but one project, data not available; project manager gave an estimate. | 10% | 15%
Client Satisfaction | ratings by project manager | For all but three projects, ratings by clients unavailable. | Not available | 4
System Test Effectiveness | defects found in system test / total defects | Total defects = defects found in system test + defects found in production (first 30 days). | 90% | 90%
Delivered Defect Density (defects per 100 function points) | (defects found in production / function points) * 100 | Production = first 30 days. | 1.3 | 1.8
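The calculations in this table are simple enough to transcribe directly; the sketch below does so, using the $110/hour rate and 145 hours per staff month stated in the notes. The input values in the usage lines are made up for illustration.

```python
# Direct transcriptions of the dashboard calculations above.

def estimating_accuracy(actual: float, estimated: float) -> float:
    """Positive = overrun, negative = underrun."""
    return (actual - estimated) / estimated

def unit_cost(labor_hours: float, function_points: float,
              rate_per_hour: float = 110.0) -> float:
    """Dollars per function point, with dollars estimated from labor hours."""
    return labor_hours * rate_per_hour / function_points

def requirements_volatility(added: int, changed: int, deleted: int,
                            baseline_total: int) -> float:
    return (added + changed + deleted) / baseline_total

def delivered_defect_density(production_defects: int,
                             function_points: float) -> float:
    """Defects per 100 function points; production = first 30 days."""
    return production_defects / function_points * 100

print(f"{estimating_accuracy(actual=2300, estimated=2000):.0%} effort overrun")
print(f"${unit_cost(labor_hours=2300, function_points=350):,.0f} per FP")
print(f"{requirements_volatility(12, 8, 4, baseline_total=180):.0%} volatility")
print(f"{delivered_defect_density(6, 350):.1f} defects per 100 FP")
```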
Summary

Right decisions, right measures:
Information framework: all information needs to be collected via a repeatable process.
Measurement is key: all high-level metrics need to be based on tangible information.
Baseline maintenance: all measurement baselines should be performed in alignment with the other baselines; all data elements need to be available in each area you want to measure, and all data elements need to be normalized.
Consistent framework of data collection: all measurements should be performed in alignment with the baseline.
Q & A
Upcoming Events

CAST Webinar: Get Smart Technical Debt
Thursday, April 19 @ 11:00am EDT
Speaker: David Norton, Senior Analyst, Gartner Research

DCG Event: QUEST Conference
April 30 - May 2, Chicago
DCG Sessions: Tutorial: Estimation Clinic: Methods, Maturity, and Money; Agile is from Venus and PMOs from Mars; Identifying Your Organization's Best Software Practices
Contact Information

Philippe Guerin
[email protected]
www.castsoftware.com
blog.castsoftware.com
@OnQuality
slideshare.net/castsoftware

David Herron
[email protected]
www.davidconsultinggroup.com
http://www.davidconsultinggroup.com/blogs/index.php