USC CSE, University of Southern California, Center for Software Engineering
Software Cost/Quality Modeling
Sunita Devnani-Chulani
Graduate Assistant, USC
COCOMO II Affiliates' Meeting, March 10, 1997
Presentation Outline
=> Motivation
• The Software Defect Introduction and Removal Model
• A-Priori Software Quality Model
• Plans
Motivation
• Insight on Determining Ship Time
• Assessment of Quality Investment Payoffs
• Understanding of Quality Strategy Interactions
Presentation Outline
• Motivation
=> The Software Defect Introduction and Removal Model
• A-Priori Software Quality Model
• Plans
'The S/W Defect Introduction and Removal Model' (SEE, Barry Boehm)
or 'Tank and Pipe Model' (Capers Jones)

'Defects conceptually flow into a holding tank through various defect-source pipes & are drained off through various defect-elimination pipes.'
Defect Introduction and Removal During Software Development

Defects introduced             Jones        Thayer & others   Boehm
  Overall rate                 30-35/KDSI   40-80/KDSI^a      65-85/KDSI
  Percentage by component:
    Requirements               10%                            8-10%
    Functional Design          15%                            15-20
    Logical Design             20                             25-35
    Coding                     30           35                25
    Documentation              35                             17-20

Defects removed (percent removed: Function / Logic / Coding):
  Automated requirements aids        63^b
  Functional specifications review   50, 45-60
  Simulation                         21
  Design language                    32
  Design standards                   29
  Logic specifications review        40, 50, 50-60
  Module logic inspection            60, 70, 58
  Module code inspection             65, 75, 70, 63
  Code standards auditor             20
  Set/use analyzer                   14
  Unit test                          10, 10, 25
  Function test                      20, 25, 55
  Component test                     15, 20, 65, 50
  Subsystem test                     15, 15, 55
  System test                        10, 10, 40, 46, 50
  (Bracketed group values from the original table layout: 55, 73, 46)

a Equivalent figure; the reported rate (10-20/KDSI) covered only defects discovered in post-integration test.
b Bell & Thayer, "Software Requirements: Are They Really a Problem?", Proceedings, 2nd Int. Conf. on Software Engineering, IEEE, Oct. 1976, pp. 61-68.
Two Different Approaches Being Researched at USC

[Diagram: two research models compared across the Reqts, Design, Coding, and Docu artifacts over phases 1-8.
  - Sunita's research model: the Tank and Pipe Model, driven by COCOMO II cost drivers and use of tools and techniques, predicting delivered defects.
  - Allen's research model: a Markov Transition Model, predicting P(N), the number of remaining faults.]
Presentation Outline
• Motivation
• The Software Defect Introduction and Removal Model
=> A-Priori Software Quality Model
• Plans
A-Priori Software Quality Model

• Defect Introduction Model
  – Baseline rates for each type of artifact*
  – Rates adjusted via COCOMO II cost drivers + DISC (Disciplined Methods)
  – Initial model ready for review and iteration
• Defect Removal Model
  – Rates for each type of artifact* determined from the project's defect removal activity levels
  – Reviews, inspections, analysis tools, tests
  – Initial model TBD
• Evolve to a-posteriori model via data collection/analysis

*Types of artifacts: requirements, design, code, documentation
A-Priori Defect Introduction Model

• For each type of artifact j:

    Number of Defects Introduced = A_j × (Size)^B × QAF_j

    QAF_j = Quality Adjustment Factor for the jth artifact
          = Π (i = 1 to N) DRM_ij

  B = provisionally set to 1
  DRM_ij = Defect Rate Multiplier for each COCOMO cost driver i and type of artifact j
  N = 23 (17 + 1 + 5) for the post-architecture model
  N = 13 (7 + 1 + 5) for the early-design model
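The formula can be sketched in Python. This is an illustrative sketch, not the calibrated model: the function and variable names are mine, and the example multipliers are the ACAP = VH (0.75) and RELY = VH (0.67) requirements-artifact values from the sensitivity tables that follow.

```python
from math import prod

def defects_introduced(size_kdsi, a_j, drms, b=1.0):
    """Defects introduced for one artifact type j:
    A_j * (Size)^B * QAF_j, where QAF_j is the product of the DRM_ij."""
    qaf_j = prod(drms)  # Quality Adjustment Factor for artifact j
    return a_j * size_kdsi ** b * qaf_j

# Requirements artifact of a 10-KDSI product, baseline A_j = 5 defects/KDSI,
# ACAP = VH (0.75) and RELY = VH (0.67); all other drivers Nominal (1.0),
# so their multipliers of 1.0 are omitted from the product.
print(defects_introduced(10, 5, [0.75, 0.67]))
```

With B = 1 the estimate scales linearly with size, so the sketch reduces to a baseline rate times size times the product of the rated multipliers.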
Modeling Effects of COCOMO Cost Drivers

Baseline defects inserted per KDSI (or per 10 FPs):
  Requirements: 5
  Design: 25
  Code: 15
  Documentation: 15

Now, if ACAP is VH & RELY is VH, how does the baseline change, as compared to ACAP = VL & RELY = VL?

This leads us to the "A-Priori Software Quality Model".
Defect Introduction Rate Sensitivity Example: Analyst Capability (ACAP)

ACAP = VH
  Requirements (0.75): fewer requirements understanding defects; fewer requirements completeness/consistency defects
  Design (0.83): fewer requirements traceability defects; fewer design completeness/consistency defects; fewer defects introduced in fixing defects
  Code (0.90): fewer coding defects due to requirements and design shortfalls (missing guidelines, ambiguities)
  Documentation (0.83): fewer documentation defects due to requirements and design shortfalls

ACAP = Nominal
  Nominal level of defect introduction (1.0)

ACAP = VL
  Requirements (1.33): more requirements understanding defects; more requirements completeness/consistency defects
  Design (1.20): more requirements traceability defects; more design completeness/consistency defects; more defects introduced in fixing defects
  Code (1.11): more coding defects due to requirements and design shortfalls (missing guidelines, ambiguities)
  Documentation (1.20): more documentation defects due to requirements and design shortfalls

Quality Range: Requirements 1.77, Design 1.45, Code 1.23, Documentation 1.45
Your Quality Range Estimate:        Comments:
Defect Introduction Rate Sensitivity Example: Required Software Reliability (RELY)

RELY = VH
  Requirements (0.67): fewer requirements completeness/consistency defects due to detailed verification, QA, CM, standards, SSR, documentation, IV&V interface, test plans, procedures
  Design (0.67): fewer design defects due to detailed verification, QA, CM, standards, PDR, documentation, IV&V interface, design inspections, test plans, procedures
  Code (0.67): fewer coding defects due to detailed verification, QA, CM, standards, documentation, IV&V interface, code inspections, test plans, procedures
  Documentation (0.75): fewer documentation defects due to requirements and design shortfalls

RELY = Nominal
  Nominal level of defect introduction (1.0)

RELY = VL
  Requirements (1.50): more requirements completeness/consistency defects due to minimal verification, QA, CM, standards, PDR, documentation, IV&V interface, test plans, procedures
  Design (1.50): more design defects due to minimal verification, QA, CM, standards, PDR, documentation, IV&V interface, design inspections, test plans, procedures
  Code (1.50): more coding defects due to minimal verification, QA, CM, standards, PDR, documentation, IV&V interface, code inspections, test plans, procedures
  Documentation (1.33): more documentation defects due to requirements and design shortfalls

Quality Range: Requirements 2.24, Design 2.24, Code 2.24, Documentation 1.77
Your Quality Range Estimate:        Comments:
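As a worked answer to the question posed on the earlier slide (how the baseline changes when ACAP and RELY are both Very High), the per-artifact VH multipliers from the two sensitivity tables can be applied to the baseline rates of 5/25/15/15 defects per KDSI. A minimal sketch, assuming all other drivers stay Nominal:

```python
# Baseline defect introduction rates (defects/KDSI).
baseline = {"Requirements": 5, "Design": 25, "Code": 15, "Documentation": 15}

# Very High multipliers per artifact, from the ACAP and RELY tables.
acap_vh = {"Requirements": 0.75, "Design": 0.83, "Code": 0.90, "Documentation": 0.83}
rely_vh = {"Requirements": 0.67, "Design": 0.67, "Code": 0.67, "Documentation": 0.75}

# Multiply each baseline rate by both drivers' multipliers.
adjusted = {art: baseline[art] * acap_vh[art] * rely_vh[art] for art in baseline}
for art, rate in adjusted.items():
    print(f"{art}: {rate:.2f} defects/KDSI")
```

For requirements, the baseline of 5 drops to about 2.5 defects/KDSI (5 × 0.75 × 0.67); swapping in the VL multipliers instead would show the other end of each quality range.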
Quality Range (Reqts Defects)

Quality range due to introduction of requirements defects (scale 1.0 to 2.5):

  Disciplined Methods                         2.24
  Required Software Reliability               2.24
  Precedentedness                             2.05
  Analyst Capability                          1.77
  Product Complexity                          1.75
  Process Maturity                            1.75
  Team Cohesion                               1.74
  Architecture/Risk Resolution                1.74
  Applications Experience                     1.56
  Multisite Development                       1.45
  Documentation Match to Life-Cycle Needs     1.45
  Platform Volatility                         1.45
  Personnel Continuity                        1.45
  Required Development Schedule               1.39
  Personnel Experience                        1.22
  Language and Tool Experience                1.18
  Use of Software Tools                       1.18
  Data Base Size                              1.15
  Execution Time Constraint                   1.08
  Main Storage Constraint                     1.08
  Required Reusability                        1.02
  Development Flexibility and Programmer Capability = 1.0
Candidate Defect Removal Activities
(Refer to COCOMO II Data Collection Questionnaire)

• Project Reviews
• Artifact Inspections, Peer Reviews
• Prototyping
• Simulation
• Automated Reqts. Aids
• Automated Design Aids
• Design Standards
• Unit Testing
• Coverage Testing
• Integration Testing
• Stress Testing
• System Testing
• Beta Testing
• Cleanroom
etc.
Defect Data Reporting Scheme

[Matrix: rows are defect types (Rqts, Design, Code, Documentation); columns are the phases in which defects are handled (Rqts; Design; Code & Unit Test; SW Integ & Test; SW Accept. Test; System Impl. & Test; Post-Operational; Other). Each cell records:

  Detected / Resolved in Phase / Cost to Resolve by Activity

Sample cells shown for the Design row: 20/20/.5, 200/100/.4, 50/30/.2, 2/6/1.0, 0/7/1.5, 0/5/.15, 2/6/4.]
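One way to hold such a reporting matrix in code is a mapping from (defect type, phase) pairs to detected/resolved/cost triples. A sketch; the phase names follow the slide, while the assignment of the sample numbers to particular phases is illustrative:

```python
# Each cell: (defects detected, defects resolved in this phase, cost to resolve).
report = {
    # (defect type, phase) -> (detected, resolved, cost)
    ("Design", "Design"):           (200, 100, 0.4),
    ("Design", "Code & Unit Test"): (50, 30, 0.2),
    ("Design", "SW Integ & Test"):  (2, 6, 1.0),
}

def total_detected(report, defect_type):
    """Sum defects of one type detected across all phases."""
    return sum(cell[0] for (dtype, _), cell in report.items() if dtype == defect_type)

print(total_detected(report, "Design"))  # 252
```

Summing down a row gives total defects of a type detected; summing down a column would give the defect load handled in a phase.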
Presentation Outline
• Motivation
• The Software Defect Introduction and Removal Model
• A-Priori Software Quality Model
=> Plans
Plans

• Iterate A-Priori Model with Affiliates
  – Would like feedback on the modeled effects of COCOMO cost drivers and the proposed defect removal model
  – Initiating Delphi process for iterating multiplier values
• Exploratory data collection & analysis
• Refinement of model
  – Identify and consolidate highly correlated model parameters
• Calibration & iteration of model
  – Statistically determine estimates of consolidated model parameters from data (Poisson regression techniques)
  – Use data-determined model parameters to adjust a-priori model parameters
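The Poisson-regression step can be illustrated in its simplest form: with no cost-driver covariates, the maximum-likelihood estimate of a constant Poisson defect rate is total defects over total size. A sketch with made-up project data (the actual calibration would regress on the consolidated cost drivers):

```python
def poisson_rate_mle(defect_counts, sizes_kdsi):
    """MLE of a constant defect rate lam (defects/KDSI), assuming
    counts d_i ~ Poisson(lam * s_i). Setting the derivative of the
    log-likelihood sum(d_i*log(lam*s_i) - lam*s_i) to zero gives
    lam = sum(d_i) / sum(s_i)."""
    return sum(defect_counts) / sum(sizes_kdsi)

# Hypothetical projects: defects found, and sizes in KDSI.
print(poisson_rate_mle([120, 45, 300], [2.0, 1.0, 5.0]))  # 58.125
```

The full model adds a log-linear term per consolidated cost driver, which is what the Poisson regression would estimate from collected data.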
What is a Defect?

• "A deviation between desired result and observed result" - Tom DeMarco
• Produces incorrect results or causes system failure
• Classification based on severity:
  – Major (e.g., causes a malfunction, causes approval of a change request)
  – Minor (e.g., violation of standards, guidelines)
  – Trivial (e.g., spelling, punctuation)
[Tank and Pipe diagram: defect-source pipes feed a tank of residual software errors.
  Requirements errors: 5/KDSI
  Design errors: 25/KDSI
  Code errors: 15/KDSI
  Documentation errors: 15/KDSI
  Overall error rate: 60/KDSI
Defect-elimination pipes drain the tank, each removing some percentage of errors at a cost C: automated requirements aids, independent requirements V&V activity, simulation, design inspections, ..., field testing.]
Insight on Determining Ship Time

Cost↑, defect rate↓, but do the benefits↑?

Windows 95 had problems of when to ship; TTM (time to market) is critical.

[Graph: residual defects vs. time and cost; reducing residual defects by Δd costs ΔC, with candidate "ship?" points marked along the curve.]
Assessment of Quality Investment Payoffs

Find a good operating point: investment cost↓, quality↓, no. of defects left↑. Good enough?

[Graph: quality, as % of total defects eliminated, approaching 100% asymptotically as investment costs grow.]
Understanding of Quality Strategy Interactions

Inspections, CASE Tools, Unit Testing, Integration Testing, Regression Testing, Alpha & Beta Testing, IV&V, Cleanroom
Integrated Cost/Quality Model

[Diagram: inputs (software product size estimate; software product, process, computer, and personnel attributes; software reuse, maintenance, and increment parameters; the software organization's project data; defect removal activity levels) feed COCOMO, recalibrated to the organization's data, and the Quality Model. Outputs: software development and maintenance cost and schedule estimates; cost and schedule distribution by phase, activity, and increment; software quality in defects/KDSI or per 10 FPs.]
Post-Architecture Defect Introduction Multiplier Ranges

Driver   Requirements   Design        Code          Documentation
RELY     0.67 - 1.50    0.67 - 1.50   0.67 - 1.50   0.75 - 1.33
DATA     1.10 - 0.96    1.15 - 0.94   1.15 - 0.94   1.10 - 0.96
CPLX     1.40 - 0.80    1.52 - 0.76   1.52 - 0.76   1.19 - 0.89
RUSE     1.0 - 0.98     1.0 - 0.96    1.0 - 0.96    1.06 - 0.98
DOCU     0.83 - 1.20    0.83 - 1.20   0.83 - 1.20   1.15 - 0.87
TIME     1.08 - 1.0     1.15 - 1.0    1.15 - 1.0    1.0 - 1.0
STOR     1.08 - 1.0     1.15 - 1.0    1.15 - 1.0    1.0 - 1.0
PVOL     1.20 - 0.83    1.20 - 0.83   1.20 - 0.83   1.15 - 0.87
ACAP     0.75 - 1.33    0.83 - 1.20   0.90 - 1.11   0.83 - 1.20
PCAP     1.0 - 1.0      0.90 - 1.11   0.75 - 1.33   0.87 - 1.15
PCON     0.83 - 1.20    0.80 - 1.25   0.75 - 1.33   0.80 - 1.25
AEXP     0.80 - 1.25    0.80 - 1.25   0.87 - 1.15   0.80 - 1.25
PEXP     0.90 - 1.11    0.87 - 1.15   0.87 - 1.15   0.90 - 1.11
LTEX     0.92 - 1.09    0.90 - 1.11   0.83 - 1.20   0.92 - 1.09
TOOL     0.92 - 1.09    0.92 - 1.09   0.80 - 1.25   0.80 - 1.25
SITE     0.83 - 1.20    0.83 - 1.20   0.83 - 1.20   0.83 - 1.20
SCED     0.90 - 1.25    0.85 - 1.20   0.85 - 1.20   0.85 - 1.20
DISC     0.67 - 1.50    0.60 - 1.67   0.50 - 2.00   0.75 - 1.33
PREC     0.65 - 1.33    0.72 - 1.25   0.76 - 1.20   0.76 - 1.20
FLEX     1.0 - 1.0      1.0 - 1.0     1.0 - 1.0     1.0 - 1.0
RESL     0.72 - 1.25    0.65 - 1.33   0.65 - 1.33   0.72 - 1.25
TEAM     0.72 - 1.25    0.76 - 1.20   0.81 - 1.15   0.81 - 1.15
PMAT     0.72 - 1.25    0.55 - 1.50   0.55 - 1.50   0.72 - 1.25
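The quality ranges charted earlier for requirements defects can be recomputed from these multiplier ranges as the ratio of each driver's largest to smallest multiplier. A sketch over a few requirements-column entries (names and structure are mine):

```python
# Requirements-column multiplier ranges for a few drivers, from the table.
req_ranges = {
    "RELY": (0.67, 1.50),
    "ACAP": (0.75, 1.33),
    "PREC": (0.65, 1.33),
    "TIME": (1.08, 1.0),
}

def quality_range(lo, hi):
    """Largest multiplier divided by smallest: the driver's total swing."""
    return max(lo, hi) / min(lo, hi)

for driver, (a, b) in req_ranges.items():
    print(f"{driver}: {quality_range(a, b):.2f}")
# RELY 2.24, ACAP 1.77, PREC 2.05, TIME 1.08, matching the earlier chart.
```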
Defect Removal Model: Proposed Approach
• Modeling cost effectiveness of individual strategies
• Modeling interaction between strategies
Modeling Cost-Effectiveness of Individual Strategies

• Not much is known about individual defect removal functions
• To complete the picture, production functions for other defect removal activities are desired

[Graph: production-function curves for Unit Test and Code Inspection, plotted against percent of programming effort (axes marked 20-100).]
Modeling Interaction Between Strategies

Technique: Inspections
  Strengths: simple programming blunders, logic defects; developer blind spots; interface defects; missing portions; specification defects
  Weaknesses: numerical approximations; program dynamics defects

Technique: Unit Test
  Strengths: simple programming blunders, logic defects; numerical approximations; program dynamics defects
  Weaknesses: developer blind spots; interface defects; missing portions; specification defects

Developing a composite production function is not easy.