
Page 1: Software Metrics and Measurements

University of Southern California

Center for Systems and Software Engineering

Software Metrics and Measurements

Supannika Koolmanojwong, CS510

Page 2: Software Metrics and Measurements

Outline

• General Concepts about Metrics
• Example of Metrics
• Agile Metrics
• Metrics from Empirical Data

Page 3: Software Metrics and Measurements

Measurements in daily life

Page 4: Software Metrics and Measurements

Why do we measure?

Page 5: Software Metrics and Measurements

Objectives of software measurement

• “You cannot control what you cannot measure.” – Tom DeMarco

• “Not everything that counts can be counted. Not everything that is counted counts.” – Albert Einstein

Page 6: Software Metrics and Measurements

Software Metrics

• Numerical data related to software development

• Strongly support software project management activities

• Can be directly observable quantities or derived from them
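For instance (an illustration, not from the slides, with made-up numbers), size in LOC and effort in person-hours are directly observable, while productivity is a metric derived from them:

```python
# Illustrative sketch: a derived metric computed from directly observable ones.
size_loc = 12_000          # directly observable: lines of code delivered
effort_person_hours = 800  # directly observable: effort spent

productivity = size_loc / effort_person_hours  # derived metric: LOC per person-hour
print(f"productivity: {productivity:.1f} LOC/person-hour")
```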

Page 7: Software Metrics and Measurements

A simplified measurement information model

(Figure: a simplified measurement information model. Information needs and objectives drive the measurement of process and work-product attributes; measurement results are turned into information products that answer those information needs and support decisions, actions, and control.)

Ref: Ebert and Dumke 2007

Page 8: Software Metrics and Measurements

How are software measurements used?

• Understand and communicate
• Specify and achieve objectives
• Identify and resolve problems
• Decide and improve

Page 9: Software Metrics and Measurements

Measurement Standard

• ISO/IEC 12207 – Software Life Cycle Processes
• ISO/IEC 15288 – System Life Cycle Processes
• SWEBOK – Software Engineering Body of Knowledge
• PMBOK – Project Management Body of Knowledge
• CMMI – Capability Maturity Model Integration
• ISO 15504 – Software Process Capability Determination
• ISO 9001 – Quality Management System
• ISO/IEC 9126 – Software Product Quality (with objective adaptations such as TL 9000, AS 9100, etc.)
• ISO/IEC 15939:2002 – Software Measurement Process: how to measure what you are doing, while the standards above address how to do it and how to do it better

Page 10: Software Metrics and Measurements

Ground rules for metrics

• Metrics must be:
– Understandable to be useful
– Economical
– Field tested
– Highly leveraged
– Timely
– Must give proper incentives for process improvement
– Evenly spaced throughout all phases of development
– Useful at multiple levels

http://www.stsc.hill.af.mil/resources/tech_docs/gsam3/chap13.pdf

Page 11: Software Metrics and Measurements

Measurements for Senior Management

• Easy and reliable visibility of business performance

• Forecasts and indicators where action is needed

• Drill-down into underlying information and commitments

• Flexibility to refocus resources

Page 12: Software Metrics and Measurements

Measurements for Project Management

• Immediate project reviews
• Status and forecasts for quality, schedule, and budget
• Follow-up action points
• Report based on consistent raw data

Page 13: Software Metrics and Measurements

Project management supporting metrics

1. Planning – Metrics serve as a basis for cost estimating, training planning, resource planning, scheduling, and budgeting.
2. Organizing – Size and schedule metrics influence a project's organization.
3. Controlling – Metrics are used to track the status of software development activities and their compliance with plans.
4. Improving – Metrics are used as a tool for process improvement, to identify where improvement efforts should be concentrated, and to measure the effects of those efforts.

Page 14: Software Metrics and Measurements

Measurements for Engineers

• Immediate access to team planning and progress

• Get visibility into own performance and how it can be improved

• Indicators that show weak spots in deliverables

• Focus energy on software development

Page 15: Software Metrics and Measurements

The E4-Measurement Process

(Figure: the E4 measurement process. Driven by objectives and needs from the business process, and by its environment and resources, measurement runs through four steps: 1. Establish, 2. Extract, 3. Evaluate, 4. Execute, feeding decisions, re-direction, and updated plans back to the business process.)

Ref: Ebert and Dumke 2007

Page 16: Software Metrics and Measurements

Aggregation of information

• Enterprise: cash flow, shareholder value, operations cost
• Division: cost reduction, sales, margins, customer service
• Product line / Department: sales, cost reduction, innovative products, level of customization
• Projects: cycle time, quality, cost, productivity, customer satisfaction, resources, skills

Page 17: Software Metrics and Measurements

SMART goals

• Specific – precise
• Measurable – tangible
• Accountable – in line with individual responsibilities
• Realistic – achievable
• Timely – suitable for the current needs

Page 18: Software Metrics and Measurements

What do you want to measure?

• Processes
– Software-related activities
• Products
– Artifacts, deliverables, documents
• Resources
– The items which are inputs to the process

Page 19: Software Metrics and Measurements

Components of software measurements

Page 20: Software Metrics and Measurements

Example of Metrics

• Progress / Effort / Cost Indicator
• Earned value management
• Requirements / Code Churn
• Defect-related metrics
• Test-related metrics

Page 21: Software Metrics and Measurements

Size

• How big is the healthcare.gov website?
– http://www.informationisbeautiful.net/visualizations/million-lines-of-code/

Page 22: Software Metrics and Measurements

Size

• Earth System Modeling Framework Project

http://www.earthsystemmodeling.org/metrics/sloc.shtml

Page 23: Software Metrics and Measurements

Progress Indicator

Page 24: Software Metrics and Measurements

Effort Indicator

Page 25: Software Metrics and Measurements

Cost Indicator

Page 26: Software Metrics and Measurements

Earned value management

• Planned Value (PV) or Budgeted Cost of Work Scheduled (BCWS)
• Earned Value (EV) or Budgeted Cost of Work Performed (BCWP)
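As an illustration (not from the slides), the standard indicators derived from these values also need the Actual Cost (AC, or ACWP), which the slide does not list; a minimal sketch with made-up numbers:

```python
# Minimal sketch of standard earned-value indicators.
# PV, EV, AC below are illustrative numbers, not from the slides.

def evm_indicators(pv: float, ev: float, ac: float) -> dict:
    """Compute common EVM indicators from planned value, earned value, actual cost."""
    return {
        "schedule_variance": ev - pv,    # SV > 0 means ahead of schedule (in budget terms)
        "cost_variance": ev - ac,        # CV > 0 means under budget
        "spi": ev / pv if pv else None,  # Schedule Performance Index
        "cpi": ev / ac if ac else None,  # Cost Performance Index
    }

if __name__ == "__main__":
    # e.g. halfway through the plan, 40% of the work done, $55k spent
    print(evm_indicators(pv=50_000, ev=40_000, ac=55_000))
```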

http://en.wikipedia.org/wiki/Earned_value_management

Page 27: Software Metrics and Measurements

Burndown Chart

http://en.wikipedia.org/wiki/Burn_down_chart
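A burndown chart plots work remaining against time; a minimal sketch (the sprint data below is made up) that prints the ideal and actual remaining effort per day:

```python
# Illustrative sketch: ideal vs. actual remaining work for a 10-day sprint.
total_points = 40
days = 10
actual_remaining = [40, 38, 35, 35, 30, 26, 22, 19, 12, 5, 0]  # day 0..10, invented

for day, actual in enumerate(actual_remaining):
    ideal = total_points * (1 - day / days)  # straight line from total down to zero
    print(f"day {day:2d}  ideal {ideal:5.1f}  actual {actual:3d}")
```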

Page 28: Software Metrics and Measurements

Requirements Churn / Requirements Creep / Requirements Volatility

• Number of changes to system requirements in each phase/week/increment

Page 29: Software Metrics and Measurements

Code Churn

• Software change history
• Large / recent changes
• Total added, modified and deleted LOC
• Number of times that a binary was edited
• Number of consecutive edits
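As a rough sketch (not from the slides), the added/deleted-LOC part of code churn can be totaled from a project's git history; `git log --numstat` is used here, and the revision range `v1.0..HEAD` is only an illustrative assumption:

```python
# Sketch: total added/deleted lines over a revision range from git's --numstat output.
import subprocess

def code_churn(rev_range: str = "v1.0..HEAD") -> tuple:
    out = subprocess.run(
        ["git", "log", "--numstat", "--pretty=format:", rev_range],
        capture_output=True, text=True, check=True,
    ).stdout
    added = deleted = 0
    for line in out.splitlines():
        parts = line.split("\t")
        # numstat lines look like "<added>\t<deleted>\t<path>"; binaries show "-" and are skipped
        if len(parts) == 3 and parts[0].isdigit() and parts[1].isdigit():
            added += int(parts[0])
            deleted += int(parts[1])
    return added, deleted

if __name__ == "__main__":
    a, d = code_churn()
    print(f"churned LOC: +{a} / -{d}")
```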

Page 30: Software Metrics and Measurements

Code Complexity

• Gathered from code itself
• Multiple complexity values
• Cyclomatic complexity
• Fan-In / Fan-Out of functions
• Lines of Code
• Weighted methods per class
• Depth of Inheritance
• Coupling between objects
• Number of subclasses
• Total global variables
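A rough sketch (my simplification, not from the slides; real tools such as radon or lizard handle more cases) that approximates cyclomatic complexity by counting decision points in each function of a Python file:

```python
# Approximate McCabe cyclomatic complexity: 1 plus the number of decision points.
import ast
import sys

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

def complexity(func: ast.FunctionDef) -> int:
    # Start at 1 (one straight-line path) and add one per decision point found.
    return 1 + sum(isinstance(n, DECISION_NODES) for n in ast.walk(func))

if __name__ == "__main__":
    tree = ast.parse(open(sys.argv[1]).read())
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            print(f"{node.name}: {complexity(node)}")
```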

Page 31: Software Metrics and Measurements

Code coverage

• Degree to which the source code is tested
• Statement coverage
– Has each node in the program been executed?
• Branch coverage
– Has each control structure been evaluated both to true and false?
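A hypothetical example (not from the slides) of why the two differ: the test below executes every statement of the function, yet only the True outcome of its branch:

```python
# coverage_demo.py -- 100% statement coverage, incomplete branch coverage.

def discount(price: float, is_member: bool) -> float:
    if is_member:          # the branch: True and False outcomes
        price *= 0.9
    return price

def test_member_discount():
    assert discount(100.0, True) == 90.0   # the is_member == False path is never taken
```

Running this under a coverage tool with branch measurement enabled (for example `coverage run --branch -m pytest coverage_demo.py` followed by `coverage report`, assuming the coverage.py package and pytest are installed) would flag the missed partial branch even though statement coverage is 100%.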

Page 32: Software Metrics and Measurements

Code Coverage

http://www.firstlinesoftware.com/metrics_group2.html

Page 33: Software Metrics and Measurements

JUnit Code Coverage

The tool instruments byte code with extra code to measure which statements are and are not reached.

(Screenshots: a code coverage report, package-level coverage, line-level coverage)

http://www.cafeaulait.org/slides/albany/codecoverage/Measuring_JUnit_Code_Coverage.html

Page 34: Software Metrics and Measurements

Defect reporting metric

• Can be categorized by
– Status
• Remaining / Resolved / Found
– Defect sources
• Requirements / Design / Development
– Defect found
• Peer review / unit testing / sanity check
– Time
• Defect arrival rate / Defect age
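A minimal sketch (the defect records are invented; the category names follow the list above) of tallying a defect log along these dimensions:

```python
# Illustrative sketch: count defects by status, source, and how they were found.
from collections import Counter

defects = [
    {"status": "Resolved",  "source": "Design",       "found_by": "Peer review"},
    {"status": "Remaining", "source": "Development",  "found_by": "Unit testing"},
    {"status": "Remaining", "source": "Requirements", "found_by": "Peer review"},
]

for field in ("status", "source", "found_by"):
    print(field, dict(Counter(d[field] for d in defects)))
```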

Page 35: Software Metrics and Measurements

Defect Status

Page 36: Software Metrics and Measurements

Defect Density
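The slide's chart is not reproduced here; as a conventional definition (my addition, not from the slide), defect density is the number of defects found per unit of size, typically per thousand lines of code:

```python
# Illustrative sketch: defect density in defects per KLOC, with made-up numbers.
def defect_density(defects_found: int, size_loc: int) -> float:
    return defects_found / (size_loc / 1000.0)

# e.g. 45 defects found in 30,000 LOC -> 1.5 defects/KLOC
print(defect_density(45, 30_000))
```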

Page 37: Software Metrics and Measurements

Test Pass Coverage

http://www.jrothman.com/Papers/QW96.html

Page 38: Software Metrics and Measurements

Defect Density

Page 39: Software Metrics and Measurements

Defect Per LOC

Page 40: Software Metrics and Measurements

Developer Code Review

After 60‒90 minutes, our ability to find defects drops off precipitously

http://answers.oreilly.com/topic/2265-best-practices-for-developer-code-review/

Page 41: Software Metrics and Measurements

As the size of the code under review increases, our ability to find all the defects decreases. Don’t review more than 400 lines of code at a time.

http://answers.oreilly.com/topic/2265-best-practices-for-developer-code-review/

Page 42: Software Metrics and Measurements

Top 6 Agile Metrics

Ref: Measuring Agility, Peter Behrens

Page 43: Software Metrics and Measurements

Ref: Measuring Agility, Peter Behrens

Velocity = Work Completed per sprint
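A minimal sketch (the story-point numbers are made up) of how velocity is tallied per sprint and averaged over recent sprints:

```python
# Illustrative sketch: velocity = story points completed per sprint.
completed_points_per_sprint = [21, 18, 25, 20]  # e.g. the last four sprints (invented)

velocity = completed_points_per_sprint[-1]  # most recent sprint
avg_velocity = sum(completed_points_per_sprint) / len(completed_points_per_sprint)

print(f"last sprint velocity: {velocity}, rolling average: {avg_velocity:.1f}")
```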

Page 44: Software Metrics and Measurements

Page 45: Software Metrics and Measurements

Measurements at the organizational level

• Empirical analysis
• Change from the top

Page 46: Software Metrics and Measurements

Richard W. Selby, Northrop Grumman Space Technology, ICSP '09
Title: "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems"

Page 47: Software Metrics and Measurements

Page 48: Software Metrics and Measurements

Page 49: Software Metrics and Measurements

Measurements for progress vs. predictions

• Project Management
– Measurements: Effort and budget tracking; Requirements status; Task status
– Predictions: Top 10 risks; Cost to complete; Schedule evolution
• Quality Management
– Measurements: Code stability; Open defects; Review status and follow-up
– Predictions: Residual defects; Reliability; Customer satisfaction
• Requirements Management
– Measurements: Analysis status; Specification progress
– Predictions: Requirements volatility / completeness
• Construction
– Measurements: Status of documents; Change requests; Review status
– Predictions: Design progress of requirements; Cost to complete; Time to complete
• Test
– Measurements: Test progress (defects, coverage, efficiency, stability)
– Predictions: Residual defects; Reliability
• Transition, deployment
– Measurements: Field performance (failures, corrections); Maintenance effort
– Predictions: Reliability; Maintenance effort

Ref: Ebert and Dumke, 2007

Page 50: Software Metrics and Measurements

Recommended books

Software Measurement: Establish - Extract - Evaluate – Execute by Christof Ebert, Reiner Dumke (2010)

Practical Software Measurement: Objective Information for Decision Makers by John McGarry, David Card, Cheryl Jones and Beth Layman (Oct 27, 2001)

Page 51: Software Metrics and Measurements

References

• http://sunset.usc.edu/classes/cs577b_2001/metricsguide/metrics.html
• Fenton, N.E., Software Metrics: A Rigorous Approach, Chapman and Hall, 1991.
• http://www.stsc.hill.af.mil/resources/tech_docs/gsam3/chap13.pdf
• Christof Ebert and Reiner Dumke, Software Measurement: Establish, Extract, Evaluate, Execute, Springer, 2007.
• http://se.inf.ethz.ch/old/teaching/2010-S/0276/slides/kissling.pdf
• Richard W. Selby, Northrop Grumman Space Technology, ICSP '09, "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems"
• Measuring Agility, Peter Behrens
• [Nikora 91] Nikora, Allen P., Error Discovery Rate by Severity Category and Time to Repair Software Failures for Three JPL Flight Projects, Software Product Assurance Section, Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099, November 5, 1991.