Metrics Based Software Supplier Selection

Uploaded by hanskuijpers, 17-Jan-2015


DESCRIPTION

Abstract: This presentation provides insight into a 'best practice' used for the selection of software suppliers at the largest Dutch telecom operator, KPN. It is related to the paper with the same title, also available on SlideShare.

TRANSCRIPT

Page 1: Metrics Based Software Supplier Selection
Page 2: Metrics Based Software Supplier Selection

Metrics Based Software Supplier Selection: best practice used in the largest Dutch telecom company

Assisi, October 2012

Hans Kuijpers

Harold van Heeringen

Page 3: Metrics Based Software Supplier Selection

Introduction

Harold van Heeringen (Sogeti Nederland BV):

• Senior Consultant Software Metrics, Sizing, Estimating & Control

• ISBSG: President

• NESMA: Board Member

• NESMA: Working Group Chair COSMIC

• NESMA: Working Group Chair Benchmarking

• COSMIC: IAC Member

• [email protected]

• @haroldveendam / @Sogeti_SEC

Hans Kuijpers (KPN Nederland):

• Manager Metrics Desk, Program Assurance & Methods

• Certified Scope Manager

• QSM: Special Interest Group Agile

• [email protected]

• @_hanskuijpers

Metrics Based Software Supplier Selection - 2 -

Page 4: Metrics Based Software Supplier Selection

Agenda

• Introduction and Context

• Phases and Timeline

• The Model

• Results

• Conclusions & Recommendations


Page 5: Metrics Based Software Supplier Selection

Introduction and Context

Why supplier selection?

• Consolidation of the number of suppliers

• Cost reduction

• The supplier acts as SI & MSP

• 5-year investment

• SPM is a best practice at KPN


[Organisation chart: KPN Board over the domains Consumer Market, Business Market, Corporate Market, NetCo, Wholesale, E-Plus and KPN Belgium; the IT domains include Fixed, Mobile, OSS, BSS, Customer Experience, Generic & Traditional, and IT Operations.]

KPN key figures: revenues €13,000m, EBITDA €5,100m, 31,000 employees.

Problem: there is no longer competition between the suppliers, so an instrument is needed to avoid excessive Unit of Work pricing.

Page 6: Metrics Based Software Supplier Selection

Agenda

• Introduction and Context

• Phases and Timeline

• The Model

• Results

• Conclusions & Recommendations


Page 7: Metrics Based Software Supplier Selection

Phases and Timeline

Why was a productivity metric added?

• Objective selection criteria

• Supplier willingness to be transparent

• Basis for a productivity baseline

• Insight into the quality level

• Negotiations for year-on-year cost reduction

• Relation to continuous improvement steps


Page 8: Metrics Based Software Supplier Selection

Requested project information

• Data on 6 historical projects, of which at most 3 KPN projects

• In scope of the current technology domain

• Range 300 – 1000 FP

• Sizing method NESMA 2.1 or IFPUG 4.x

• The DCF (Data Collection Form) must be completely filled out

• No other template is allowed

In the BAFO phase, suppliers should show evidence of the size and productivity figures by releasing FPA reports, Data Collection Forms and/or insight into their administrative systems.


Page 9: Metrics Based Software Supplier Selection

Historical Project Data form (1)


Page 10: Metrics Based Software Supplier Selection

Historical Project Data form (2)


Per data field, the requirements are stated in the template.

Page 11: Metrics Based Software Supplier Selection

Agenda

• Introduction and Context

• Phases and Timeline

• The Model

• Results

• Conclusions & Recommendations


Page 12: Metrics Based Software Supplier Selection

The Model

Characteristics:

• Degree of openness and compliancy

• Completeness and cohesion of the submitted data

• Productivity benchmarked against each other and against the industry

• Delivered quality

• During the RFP phase the data is assumed to be correct, but it will be checked for realism

The 3 test criteria:

A. Compliancy value (10%)

B. Reality value (30%)

C. Productivity - Quality value (60%)


Page 13: Metrics Based Software Supplier Selection

Used metrics and benchmarks

Project Delivery Rate (PDR) = project effort spent per function point (h/FP)

Productivity Index (PI) = a metric from QSM, derived from size, duration and effort

Quality = delivered defects per FP

Benchmarks:

• PI against the QSM Business trend line

• PDR against the ISBSG repository

• Adjusted = normalised to Construction + Test activities

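Both ratio metrics are simple quotients; a minimal sketch in Python (function and variable names are mine, not KPN's or ISBSG's):

```python
def project_delivery_rate(effort_hours: float, size_fp: float) -> float:
    """Project Delivery Rate (PDR): hours of project effort per
    function point (h/FP). Lower is better."""
    return effort_hours / size_fp

def defect_density(delivered_defects: int, size_fp: float) -> float:
    """Delivered quality: defects found in the delivered product
    per function point. Lower is better."""
    return delivered_defects / size_fp

# Hypothetical 500 FP project, 3000 h of effort, 25 delivered defects:
pdr = project_delivery_rate(3000, 500)   # 6.0 h/FP
quality = defect_density(25, 500)        # 0.05 defects/FP
```

The PI itself cannot be reproduced this way: it is QSM's proprietary metric, derived from size, duration and effort together.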

Page 14: Metrics Based Software Supplier Selection

Compliancy value (10%)

Suppliers start with 10 points.

The compliancy value is reduced by 2 points for each violation of a rule:

a) Range 300 – 1000 Function Points

b) Method NESMA 2.1 or IFPUG 4.x

c) Each field of the "Historical Project Data" form must be filled out

Maximum value = 10, minimum value = 0

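The scoring rule above is easy to make precise; a sketch (the floor at 0 is stated on the slide):

```python
def compliancy_value(violations: int) -> int:
    """Start at 10 points, subtract 2 per rule violation,
    never dropping below 0."""
    return max(0, 10 - 2 * violations)

# Three violations leave 4 points; five or more leave 0.
print(compliancy_value(3))   # 4
print(compliancy_value(6))   # 0
```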

Page 15: Metrics Based Software Supplier Selection

Reality value (30%)

Unrealistic projects are discarded from further analysis:

• PI > +2 sigma (95%)

• PDR < P25 of the ISBSG repository (best-in-class projects)

The reality value is reduced by 2 points for each unrealistic project.

Maximum value = 10, minimum value = 0

[Chart: PI vs. functional size (50 – 950 FP) for the projects of suppliers A – E, plotted against the QSM Business average trend line and its 2 sigma line; projects above the 2 sigma line are marked unrealistic.]
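The reality test can be sketched as a filter plus the same subtract-2 scoring. The benchmark thresholds (`pi_2sigma`, `pdr_p25`) below are illustrative placeholders, not the actual QSM or ISBSG values:

```python
def is_realistic(pi: float, pdr: float,
                 pi_2sigma: float, pdr_p25: float) -> bool:
    """A project is unrealistic when its PI lies above the +2 sigma
    trend line, or its PDR is better (lower) than the ISBSG 25th
    percentile."""
    return pi <= pi_2sigma and pdr >= pdr_p25

def reality_value(projects, pi_2sigma, pdr_p25):
    """Return (score, number of unrealistic projects): start at 10,
    subtract 2 per unrealistic project, floor at 0."""
    unrealistic = sum(
        1 for pi, pdr in projects
        if not is_realistic(pi, pdr, pi_2sigma, pdr_p25)
    )
    return max(0, 10 - 2 * unrealistic), unrealistic

# Six submitted (PI, PDR) pairs; one fails the PI test, one the PDR test:
projects = [(12.0, 6.5), (25.0, 6.0), (14.0, 2.0),
            (11.0, 7.0), (13.0, 8.0), (10.0, 9.0)]
value, n = reality_value(projects, pi_2sigma=20.0, pdr_p25=4.0)
# value == 6, n == 2  (matching Supplier A's reality value on the results slide)
```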

Page 16: Metrics Based Software Supplier Selection

Productivity - Quality value (60%)

ID   PDR (h/FP)   ISBSG median (h/FP)   PDR score
7    5.9          8.6                   -2.7
8    6.0          8.6                   -2.6
9    6.9          8.6                   -1.7
11   6.2          8.6                   -2.4
12   7.3          8.6                   -1.3
Average: -2.1

ID   Defects/FP
15   41.7
18   13.9
21   66.7
22   4.0
23   10.0
Median: 13.9

For PI, the highest average scores the most points; for PDR, the lowest average; for quality, the lowest median.

For PI and PDR, the average distance to the benchmark value is determined; for quality, the median is determined.

Productivity - Quality value = (PI points × 0.5) + (PDR points × 0.3) + (Quality points × 0.2)
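The weighted sum above, as a one-line sketch:

```python
def productivity_quality_value(pi_pts: int, pdr_pts: int,
                               quality_pts: int) -> float:
    """Combine the three point scores with the slide's weights:
    PI 50%, PDR 30%, delivered quality 20%."""
    return 0.5 * pi_pts + 0.3 * pdr_pts + 0.2 * quality_pts

# Supplier A on the results slide: 8, 10 and 10 points give 9.0
pq_a = productivity_quality_value(8, 10, 10)
```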

Page 17: Metrics Based Software Supplier Selection

Agenda

• Introduction and Context

• Phases and Timeline

• The Model

• Results

• Conclusions & Recommendations


Page 18: Metrics Based Software Supplier Selection

Results of Compliancy (1)

Projects discarded:

• 4 projects still ongoing

• 1 project sized in COSMIC

Blank crucial fields:

• Defect data

• Effort data

• Dates

Other violations:

• Primary language (example: English)

Supplier     Compliancy value
Supplier A   0
Supplier B   0
Supplier C   4
Supplier D   0
Supplier E   0

Result: one supplier had 3 violations, the others 5 or more.

Page 19: Metrics Based Software Supplier Selection

Results of Compliancy (2)


Page 20: Metrics Based Software Supplier Selection

Results of Reality

Unrealistic projects:

• 3 according to the PI criterion

• 1 according to the PDR criterion

These are discarded from further analysis.

Supplier     Unrealistic (PI)   Unrealistic (PDR)   Reality value
Supplier A   1                  1                   6
Supplier B   1                  0                   8
Supplier C   0                  0                   10
Supplier D   0                  0                   10
Supplier E   1                  0                   8

Page 21: Metrics Based Software Supplier Selection

Results of Productivity / Quality

Supplier     PI score   Rank   Points
Supplier A   3.9        2      8
Supplier B   5.0        1      10
Supplier C   3.4        3      6
Supplier D   3.0        5      2
Supplier E   3.2        4      4

Supplier     PDR score   Rank   Points
Supplier A   -3.2        1      10
Supplier B   -2.1        2      8
Supplier C   16.6        4      4
Supplier D   6.2         3      6
Supplier E   18.3        5      2

Supplier     Quality score   Rank   Points
Supplier A   3.1             1      10
Supplier B   13.9            2      8
Supplier C   52.6            3      6
Supplier D   1000.0          5      2
Supplier E   94.6            4      4

Supplier     PI points (50%)   PDR points (30%)   Quality points (20%)   Productivity/Quality value
Supplier A   8                 10                 10                     9.0
Supplier B   10                8                  8                      9.0
Supplier C   6                 4                  6                      5.4
Supplier D   2                 6                  2                      3.2
Supplier E   4                 2                  4                      3.4
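The points columns follow one pattern across the PI, PDR and quality tables: with five suppliers, rank 1 earns 10 points and every next rank 2 fewer. A sketch of that (inferred) mapping:

```python
def points_for_rank(rank: int, n_suppliers: int = 5) -> int:
    """Best rank earns the most points; each next rank earns 2 fewer.
    With 5 suppliers this yields 10, 8, 6, 4, 2."""
    return 2 * (n_suppliers - rank) + 2

ranking = [points_for_rank(r) for r in range(1, 6)]   # [10, 8, 6, 4, 2]
```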

Page 22: Metrics Based Software Supplier Selection

Results of Total Assessment

Recommendation from the Metrics Desk: suppliers B and A score best in the model.

Supplier     Compliancy (10%)   Reality (30%)   Productivity/Quality (60%)   Total   Rank
Supplier A   0                  6               9.0                          7.2     2
Supplier B   0                  8               9.0                          7.8     1
Supplier C   4                  10              5.4                          6.6     3
Supplier D   0                  10              3.2                          4.9     4
Supplier E   0                  8               3.4                          4.4     5

However, suppliers B and C were selected for the next BAFO phase.
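The total column combines the three test values with the 10/30/60 weights; a minimal sketch:

```python
def total_points(compliancy: float, reality: float,
                 prod_quality: float) -> float:
    """Final assessment: compliancy 10%, reality 30%,
    productivity/quality 60%."""
    return 0.1 * compliancy + 0.3 * reality + 0.6 * prod_quality

# Supplier B: 0, 8 and 9.0 give 7.8 (rank 1); Supplier A: 0, 6 and 9.0 give 7.2
total_b = total_points(0, 8, 9.0)
total_a = total_points(0, 6, 9.0)
```

Note that with these weights a supplier can score 0 on compliancy, as four of the five did, and still come out on top on productivity and quality.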

Page 23: Metrics Based Software Supplier Selection

Findings BAFO phase

The Metrics Desk investigated the project data provided by the selected suppliers B and C:

• Size

• Dates

• Hours

• Defects

Supplier B:

• Resistance: confidentiality clause with clients

• Client site visit

Supplier C:

• Size measurement performed by junior, non-certified measurers


Page 24: Metrics Based Software Supplier Selection

Agenda

• Introduction and Context

• Phases and Timeline

• The Model

• Results

• Conclusions & Recommendations


Page 25: Metrics Based Software Supplier Selection

Conclusions and recommendations

Conclusions:

• The productivity assessment significantly influenced the total outcome

• The assessment and the discussions afterwards gave insight into the suppliers' transparency and CMMI level

• The results are used in the negotiation phase to maximise the baseline value

Recommendations:

• Make sure the parties understand:

  • the purpose of the assessment

  • the use of the "Historical Project Data" form

  • that the disclosed data will be validated and should not be confidential

  • the consequences of violating the governance rules (e.g. penalty points)

• Because of the many violations of the compliancy rules, consider 1 penalty point per violation

• Construct the model beforehand, but do not communicate the model to the suppliers

• Accept site visits when offered; they give extra information in addition to the productivity validation

Page 26: Metrics Based Software Supplier Selection

Productivity: don’t trust it, check it


Hans Kuijpers

Software Metrics Consultant

[email protected]

@_hanskuijpers

Harold van Heeringen

Sizing, Estimating & Control

[email protected]

@haroldveendam

@Sogeti_SEC

Page 27: Metrics Based Software Supplier Selection

Back-up sheets


Page 28: Metrics Based Software Supplier Selection

Productivity Index (PI)

[Chart: effort vs. duration trade-off curves for a 500 FP project at Productivity Index 18 and Productivity Index 16; along the curves the PDR ranges roughly from 7 to 17 h/FP, showing that a higher PI delivers the same size with less effort and/or a shorter duration.]
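QSM's PI itself is proprietary, but public descriptions relate it to the process-productivity parameter in Putnam's software equation; as a sketch (the symbols and the skills factor $B$ follow Putnam's published model, not QSM's exact calibration):

```latex
% Putnam's software equation: size as a function of effort and duration
\mathrm{Size} \;=\; PP \cdot \left(\frac{E}{B}\right)^{1/3} \cdot t^{4/3}
```

where $E$ is effort, $t$ is duration, $B$ is a skills factor, and $PP$ is the process productivity, of which the PI is a discrete index. For a fixed Size, raising $PP$ lowers the required $E$ and/or $t$, which is exactly the shift between the PI = 16 and PI = 18 curves in the backup chart.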