July 2002
COSYSMO-IP COnstructive SYStems Engineering Cost
Model – Information Processing
PSM User’s Group Conference
Keystone, Colorado
July 24 & 25, 2002
Dr. Barry Boehm
Ricardo Valerdi
University of Southern California
Center for Software Engineering
USC CSE
University of Southern California, Center for Software Engineering
Version 3
Outline – Day 1
• USC Center for Software Engineering
• Background & Update on COSYSMO-IP
• Ops Con & EIA632
• Delphi Round 1 Results
• Updated Drivers
• Lessons Learned/Improvements
• LMCO & INCOSE Comments
• Q & A
Outline – Day 2
• Review of yesterday's modified slides to clarify terminology
• A few new slides to emphasize points
• Review of current driver definitions
• Definitions for two new cost drivers
  – Technology Maturity
  – Physical system/information system tradeoff analysis complexity
Objectives of the Workshop
• Agree on a Concept of Operation
• Converge on scope of COSYSMO-IP model
• Address definitions of model parameters
• Discuss data collection process
USC Center for Software Engineering
• 8 faculty/research staff, 18 PhD students
• Corporate Affiliates program (TRW, Aerospace, Galorath, Raytheon, Lockheed, Motorola, et al.)
• 17th International Forum on COCOMO and Software Cost Modeling, October 22–25, 2002, Los Angeles, CA
  – Theme: Software Cost Estimation and Risk Management
• Annual research review in March 2003
COSYSMO-IP: What is it?
The purpose of the COSYSMO-IP project is to develop an initial increment of a parametric model to estimate the cost of systems engineering activities during system development.
The focus of the initial increment is on the cost of systems engineering for information processing systems or subsystems.
What Does COSYSMO-IP Cover?
• Includes:
  – Systems engineering in the inception, elaboration, and construction phases, including test planning
  – Requirements development and specification activities
  – Physical system/information system tradeoff analysis
  – Operations analysis and design activities
  – System architecture tasks, including allocations to hardware/software and consideration of COTS, NDI, and legacy impacts
  – Algorithm development and validation tasks
• Defers:
  – Physical system/information system operational test & evaluation, deployment
  – Special-purpose hardware design and development
  – Structure, power, and/or specialty engineering
  – Manufacturing and/or production analysis
Candidate COSYSMO Evolution Path
(Phases spanned: Inception, Elaboration, Construction, Transition, Operational Test & Eval)
1. COSYSMO-IP – IP (sub)system
2. COSYSMO-C4ISR – C4ISR system
3. COSYSMO-Machine – physical machine system
4. COSYSMO-SoS – system of systems (SoS)
Current COSYSMO-IP Operational Concept
[Diagram: inputs to COSYSMO-IP are Size Drivers (# Requirements, # Interfaces, # Scenarios, # Algorithms, volatility factor, …) and Effort Multipliers (application factors, team factors, schedule driver), together with a Calibration input and a WBS guided by EIA 632; outputs are Effort and Duration.]
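The operational concept above can be sketched as a COCOMO-style parametric form: a weighted size measure scaled by multiplicative cost drivers. This is only a sketch; the function name, the calibration constants `a` and `e`, and the default weights (loosely echoing the Delphi relative-effort values) are placeholders, not calibrated model values.

```python
# Illustrative sketch of the COSYSMO-IP operational concept.
# Constants and weights are hypothetical, NOT calibrated values.

def cosysmo_ip_effort(size_drivers, effort_multipliers, a=1.0, e=1.0, weights=None):
    """Estimate systems engineering effort (person-months).

    size_drivers: dict of counts (requirements, interfaces, scenarios, algorithms).
    effort_multipliers: dict of multiplicative cost-driver ratings (1.0 = nominal).
    """
    if weights is None:
        # Hypothetical relative-effort weights, normalized so that
        # one requirement = 1 (the Delphi baseline driver).
        weights = {"requirements": 1.0, "interfaces": 5.57,
                   "scenarios": 2.2, "algorithms": 6.48}
    # Aggregate size: weighted sum of the driver counts.
    size = sum(weights[k] * n for k, n in size_drivers.items())
    # Product of the cost-driver effort multipliers.
    multiplier = 1.0
    for rating in effort_multipliers.values():
        multiplier *= rating
    return a * size ** e * multiplier

effort = cosysmo_ip_effort(
    {"requirements": 100, "interfaces": 4, "scenarios": 6, "algorithms": 2},
    {"requirements_understanding": 1.2, "tool_support": 0.9})
```

The multiplicative structure mirrors COCOMO II, where each driver rating scales the nominal estimate independently.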
EIA632/COSYSMO-IP Mapping

COSYSMO-IP Category             EIA632 Requirement
Supplier Performance            3
Technical Management            4-12
Requirements Definition         14-16
Solution Definition             17-19
Systems Analysis                22-24
Requirements Validation         25-29
Design Solution Verification    30
End Products Validation - COTS  33a
EIA632 Reqs. not included in COSYSMO-IP are: 1,2,13,20,21,31,32,33b
Activity Elements Covered by EIA632, COCOMOII, and COSYSMO-IP

Percent of effort by activity and development stage:

Activity         Inception  Elaboration  Construction  Transition
Management           14         12            10            14
Environment/CM       10          8             5             5
Requirements         38         18             8             4
Design               19         36            16             4
Implementation        8         13            34            19
Assessment            8         10            24            24
Deployment            3          3             3            30

[Remaining cells are marked TBD. The original chart color-codes the cells covered by COCOMOII vs. COSYSMO-IP; that coding is not recoverable here.]

When doing COSYSMO-IP and COCOMOII together, subtract the grey (overlapping) areas to prevent double counting.
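The subtraction rule above amounts to simple arithmetic. All figures below are hypothetical person-month values chosen only to illustrate the adjustment:

```python
# When combining COSYSMO-IP and COCOMO II estimates, activity effort
# counted by both models (the grey cells in the chart) must be
# subtracted once. All numbers are hypothetical person-months.
cosysmo_ip_effort = 60.0   # systems engineering effort estimate
cocomoii_effort = 200.0    # software development effort estimate
overlap_effort = 15.0      # effort covered by both models (grey areas)

combined = cosysmo_ip_effort + cocomoii_effort - overlap_effort
```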
Past, Present, and Future

Timeline (2001–2003):
• Initial set of parameters compiled by Affiliates
• Performed first Delphi round
• Working Group meeting at ARR
• PSM Workshop
• Meeting at CCII Conference
Future Parameter Refinement Opportunities

Timeline (2003–2005):
• Driver definitions
• Data collection (Delphi)
• Model calibration
• First iteration of model
Delphi Survey
• Survey was conducted to:
  – Determine the distribution of effort across effort categories
  – Determine the range for size driver and effort multiplier ratings
  – Identify the cost drivers to which effort is most sensitive
  – Reach consensus from a sample of systems engineering experts
• Distributed Delphi surveys to Affiliates and received 28 responses
• 3 sections: Scope, Size, Cost
• Also helped us refine the scope of the model elements
Delphi Round 1 Results: System Engineering Effort Distribution

Category (EIA Requirement)          Delphi   Suggested   Std. Dev.
Supplier Performance (3)             5.2%       5%         3.05
Technical Management (4-12)         13.1%      15%         4.25
Requirements Definition (14-16)     16.6%      15%         4.54
Solution Definition (17-19)         18.1%      20%         4.28
Systems Analysis (22-24)            19.2%      20%         5.97
Requirements Validation (25-29)     11.3%      15%         4.58
Design Solution Verification (30)   10.5%       5%         6.07
End Products Validation (33a)        6.6%       5%         3.58
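One way to use the distribution above: spread a total systems engineering effort estimate across the EIA 632 categories using the Suggested percentages. A minimal sketch; the 120 person-month total is a hypothetical input, not data from the survey:

```python
# Allocate a total SE effort estimate across EIA 632 activity
# categories using the "Suggested" percentages from the table above.
suggested = {
    "Supplier Performance": 0.05,
    "Technical Management": 0.15,
    "Requirements Definition": 0.15,
    "Solution Definition": 0.20,
    "Systems Analysis": 0.20,
    "Requirements Validation": 0.15,
    "Design Solution Verification": 0.05,
    "End Products Validation": 0.05,
}
assert abs(sum(suggested.values()) - 1.0) < 1e-9  # shares must sum to 100%

total_effort_pm = 120  # hypothetical total SE effort, person-months
allocation = {cat: total_effort_pm * share for cat, share in suggested.items()}
```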
Delphi Round 1 Highlights (cont.)
Range of sensitivity for Size Drivers

[Bar chart of relative effort, normalized to # Requirements = 1. # Algorithms (6.48) and # Interfaces (5.57) are the most sensitive; # TPMs, # Scenarios, # Modes, and # Platforms fall between roughly 2.10 and 2.54.]
Two Most Sensitive Size Drivers

Size Driver    Suggested Rel. Effort   Delphi Rel. Effort   Std. Dev.
# Interfaces            4                    5.57              1.80
# Algorithms            6                    6.48              2.09
Delphi Round 1 Highlights (cont.)
Range of sensitivity for Cost Drivers (Application Factors)

[Bar chart of EMR values for: Requirements und. (2.43), Architecture und. (2.24), Level of service reqs. (2.81), Legacy transition, COTS, Platform difficulty, and Bus. process reeng. The remaining values (1.93, 2.13, 1.74, 1.13) cannot be matched to specific drivers from this extraction.]
Delphi Round 1 Highlights (cont.)
Range of sensitivity for Cost Drivers (Team Factors)

[Bar chart of EMR values for: Tool support, Stakeholder comm., Stakeholder cohesion, Personnel capability (2.46), Personnel experience, Process maturity, Multisite coord., and Formality of deliv. The remaining values (1.28, 1.91, 2.16, 1.94, 1.25, 1.84, 1.78) cannot be matched to specific drivers from this extraction.]
Four Most Sensitive Cost Drivers

Cost Driver     Suggested EMR   Delphi Mean EMR   Std. Dev.
Arch. Under.        1.66             2.24            0.83
Reqs. Under.        1.73             2.43            0.70
Pers. Cap.          2.15             2.46            0.66
Serv. Req.          2.5              2.81            0.67
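The EMR (Effort Multiplier Ratio) used throughout these results is, roughly, the ratio of a cost driver's highest to lowest effort multiplier across its rating scale, i.e., the total swing that driver can exert on the estimate. The rating values below are hypothetical, chosen only so the ratio lands near the Delphi mean for Requirements understanding:

```python
# EMR = highest effort multiplier / lowest effort multiplier across the
# rating scale for one cost driver. Multipliers below 1.0 reduce effort
# (favorable ratings); above 1.0 increase it. Values are hypothetical.
ratings = {"very_low": 1.56, "low": 1.25, "nominal": 1.00,
           "high": 0.80, "very_high": 0.64}

emr = max(ratings.values()) / min(ratings.values())
```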
4 Size Drivers
1. Number of System Requirements
2. Number of Major Interfaces
3. Number of Operational Scenarios
4. Number of Unique Algorithms

Candidates marked as cost drivers:
• Number of Technical Performance Measures
• Number of Modes of Operation
• Number of Different Platforms
Size Driver Definitions (1 of 4)

Number of System Requirements
The number of requirements taken from the system specification. A requirement is a statement of capability or attribute containing a normative verb such as shall or will. It may be functional or system service-oriented in nature, depending on the methodology used for specification. System requirements can typically be quantified by counting the number of applicable shall's or will's in the system or marketing specification.

Note: Use this driver as the basis of comparison for the rest of the drivers.
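The counting rule above (tally the applicable shall's and will's) can be sketched as a small script. The sentence-splitting heuristic and the sample text are illustrative only; as the later slide notes, real counting rules still need refinement (e.g., excluding non-normative uses of "will"):

```python
# Rough sketch: count sentences containing the normative verbs
# "shall" or "will" in a specification text.
import re

def count_requirements(spec_text):
    """Count sentences containing 'shall' or 'will' as whole words."""
    # Naive sentence split on terminal punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", spec_text)
    return sum(1 for s in sentences
               if re.search(r"\b(shall|will)\b", s, re.IGNORECASE))

spec = ("The system shall log all transactions. "
        "Operators will be notified of faults. "
        "This section describes the background.")
```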
Size Driver Definitions (2 of 4)

Number of Major Interfaces
The number of shared major physical and logical boundaries between system components or functions (internal interfaces) and those external to the system (external interfaces). These interfaces can typically be quantified by counting the interfaces identified in the system's context diagram and/or the significant interfaces in applicable Interface Control Documents.
Size Driver Definitions (3 of 4)

Number of Operational Scenarios*
The number of operational scenarios** that a system is specified to satisfy. Such threads typically result in end-to-end test scenarios that are developed to validate that the system satisfies its requirements. The number of scenarios can typically be quantified by counting the number of end-to-end tests used to validate the system functionality and performance. They can also be calculated by counting the number of high-level use cases developed as part of the operational architecture.
Number of Modes of Operation (to be merged with Op Scen)The number of defined modes of operation for a system. For example, in a radar system, the operational modes could be air-to-air, air-to-ground, weather, targeting, etc. The number of modes is quantified by counting the number of operational modes specified in the Operational Requirements Document.
*counting rules need to be refined **Op Scen can be derived from system modes
Size Driver Definitions (4 of 4)

Number of Unique Algorithms
The number of newly defined or significantly altered functions that require unique mathematical algorithms to be derived in order to achieve the system performance requirements.

Note: Examples could include a complex aircraft tracking algorithm, such as a Kalman filter, being derived using existing experience as the basis for the all-aspect search function. Another example could be a brand new discrimination algorithm being derived for the identify-friend-or-foe function in space-based applications. The number can be quantified by counting the number of unique algorithms needed to support each of the mathematical functions specified in the system specification or mode description document (for sensor-based systems).
12 Cost Drivers

Application Factors:
1. Requirements understanding
2. Architecture complexity
3. Level of service requirements
4. Migration complexity
5. COTS assessment complexity
6. Platform difficulty
7. Required business process reengineering
8. Technology Maturity
9. Physical system/information subsystem tradeoff analysis complexity
Cost Driver Definitions (1, 2 of 5)

Requirements understanding
The level of understanding of the system requirements by all stakeholders: systems, software, and hardware engineers, customers, team members, users, etc.

Architecture complexity
The relative difficulty of determining and managing the system architecture in terms of IP platforms, standards, components (COTS/GOTS/NDI/new), connectors (protocols), and constraints. This includes systems analysis, tradeoff analysis, modeling, simulation, case studies, etc.
Cost Driver Definitions (3, 4, 5 of 5)

Migration complexity (formerly Legacy transition complexity)
The complexity of migrating the system from previous system components, databases, workflows, etc., due to new technology introductions, planned upgrades, increased performance, business process reengineering, etc.

Level of service requirements
The difficulty and criticality of satisfying the Key Performance Parameters (KPPs), e.g., security, safety, response time, the "ilities," etc.

Technology Maturity
The relative readiness for operational use of the key technologies.
12 Cost Drivers (cont.)

Team Factors:
1. Number and diversity of stakeholder communities
2. Stakeholder team cohesion
3. Personnel capability
4. Personnel experience/continuity
5. Process maturity
6. Multisite coordination
7. Formality of deliverables
8. Tool support
Cost Driver Definitions (1, 2, 3 of 7)

Stakeholder team cohesion
Leadership, frequency of meetings, shared vision, approval cycles, group dynamics (self-directed teams, project engineers/managers), IPT framework, and effective team dynamics.

Personnel capability
The systems engineering team's ability to perform its duties, and the quality of the human capital.

Personnel experience/continuity
The applicability and consistency of the staff over the life of the project with respect to the customer, user, technology, domain, etc.
Cost Driver Definitions (4, 5, 6, 7 of 7)

Process maturity
Maturity per EIA/IS 731, SE-CMM, or CMMI.

Multisite coordination
Location of stakeholders, team members, and resources (travel).

Formality of deliverables
The breadth and depth of documentation required to be formally delivered.

Tool support
Use of tools in the systems engineering environment.
Lessons Learned/Improvements

Lesson 1 – Need to better define the scope and future of COSYSMO-IP via Con Ops
Lesson 2 – Drivers can be interpreted in different ways depending on the type of program
Lesson 3 – COSYSMO is too software-oriented
Lesson 4 – Delphi needs to take less time to fill out
Lesson 5 – Need to develop examples and rating scales
LMCO Comments

The current COSYSMO focus is too software-oriented. This is a good point. We propose to change the scope from "software-intensive systems or subsystems" to "information processing (IP) systems or subsystems." These include not just the software but also the associated IP hardware: processors, memory, networking, and display or other human-computer interaction devices. Systems engineering of these IP systems or subsystems includes consideration of IP hardware device acquisition lead times, producibility, and logistics. Non-IP hardware acquisition, producibility, and logistics are treated as IP systems engineering cost and schedule drivers for the IOC version of COSYSMO. Perhaps we should call it COSYSMO-IP.
LMCO Comments (cont.)

The COSYSMO project should begin by working out the general framework and WBS for the full life cycle of a general system. We agree that such a general framework and WBS will eventually be needed. However, we feel that progress toward it can be most expeditiously advanced by first working on definitions of and data for a key element of the general problem. If another group would like to concurrently work out the counterpart definitions and data considerations for the general systems engineering framework, WBS, and estimation model, we will be happy to collaborate with them.
Points of Contact

Dr. Barry Boehm    [email protected]    (213) 740-8163
Ricardo Valerdi    [email protected]     (213) 440-4378
Donald Reifer      [email protected]       (310) 530-4493

Websites
http://valerdi.com/cosysmo
http://sunset.usc.edu
Backup slides
COCOMOII Suite

[Diagram: COCOMOII at the center of a suite of related models: COQUALMO, COPSEMO, CORADMO, COSYSMO-IP, COPROMO, COCOTS.]

For more information visit http://sunset.usc.edu