
Modern Tools to Support DoD Software Intensive System of Systems Cost Estimation

A DACS State-of-the-Art Report

DACS Report Number 347336

Prepared for the DACS by

Jo Ann Lane and Barry Boehm

University of Southern California

Center for Systems and Software Engineering

941 W. 37th Place, SAL Room 328

Los Angeles, CA 90089-0781

Jo Ann Lane – [email protected]

Barry Boehm – [email protected]

Prepared by:

Data and Analysis Center for Software

ITT Advanced Engineering & Sciences

775 Daedalian Dr.

Rome, New York 13441-4909

Distribution Statement A

Approved for public release: distribution is unlimited


REPORT DOCUMENTATION PAGE (Form Approved, OMB No. 0704-0188)

1. REPORT DATE (DD-MM-YYYY): 31 August 2007
2. REPORT TYPE:
3. DATES COVERED (From - To): N/A
4. TITLE AND SUBTITLE: Modern Tools to Support DoD Software Intensive System of Systems Cost Estimation: A DACS State of the Art Report
5a. CONTRACT NUMBER: SP0700-98-D-4000
5b. GRANT NUMBER:
5c. PROGRAM ELEMENT NUMBER: N/A
5d. PROJECT NUMBER: N/A
5e. TASK NUMBER: N/A
5f. WORK UNIT NUMBER: N/A
6. AUTHOR(S): Jo Ann Lane (USC CSSE); Dr. Barry Boehm (USC CSSE)
7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES): University of Southern California Center for Systems and Software Engineering, 941 W. 37th Place, SAL Room 328, Los Angeles, CA 90089-0781; ITT Industries, Advanced Engineering & Sciences, 775 Daedalian Dr., Rome, NY 13441-4909
8. PERFORMING ORGANIZATION REPORT NUMBER: DAN 347336
9. SPONSORING/MONITORING AGENCY NAME(S) AND ADDRESS(ES): Defense Technical Information Center, DTIC/AI, 8725 John J. Kingman Rd., STE 0944, Ft. Belvoir, VA 22060
10. SPONSOR/MONITOR'S ACRONYM(S): DTIC
11. SPONSOR/MONITOR'S REPORT NUMBER(S):
12. DISTRIBUTION/AVAILABILITY STATEMENT: Approved for Public Release, Distribution Unlimited
13. SUPPLEMENTARY NOTES:
14. ABSTRACT: Many Department of Defense (DoD) organizations are attempting to provide new system capabilities through the net-centric integration of existing software-intensive systems into a new system often referred to as a Software-Intensive System of Systems (SISOS). The goal of this approach is to build on existing capabilities to produce new capabilities not provided by the existing systems in a timely and cost-effective manner. Many of these new SISOS efforts, such as the Future Combat Systems, are of a size and complexity unlike their predecessor systems, and cost estimation tools such as the Constructive Cost Model (COCOMO) suite are undergoing significant enhancements to address these challenges. This report describes the unique challenges of SISOS cost estimation, how current tools are changing to support these challenges, and on-going efforts to further support SISOS cost estimation needs. This report concentrates heavily on the COCOMO-based models and tools, but also incorporates activities underway by other cost model vendors.
15. SUBJECT TERMS: SOFTWARE ENGINEERING TOOLS AND TECHNIQUES, COST ESTIMATION, COST ESTIMATION MODELS, SOFTWARE ENGINEERING, SYSTEMS ENGINEERING, SOFTWARE ENGINEERING PROCESS
16. SECURITY CLASSIFICATION OF: a. REPORT: U; b. ABSTRACT: U; c. THIS PAGE: U
17. LIMITATION OF ABSTRACT: UU
18. NUMBER OF PAGES: 55
19a. NAME OF RESPONSIBLE PERSON: Thomas McGibbon
19b. TELEPHONE NUMBER (include area code): 315-334-4933

Standard Form 298 (Rev. 8-98), prescribed by ANSI Std. Z39.18


Table of Contents

Abstract
Acknowledgements
1.0 Executive Summary
2.0 Introduction
    2.1. Purpose
    2.2. Intended Audience
3.0 Motivation and Context
    3.1. Motivation for developing system of systems
    3.2. Nature and definition of "system of systems"
        Inadequacy of traditional cost models in estimating system of systems costs
4.0 Approaches to SISOS Cost Estimation
    4.1. Architecture-based estimates using parametric models
    4.2. Activity-based estimation
    4.3. Level of effort
    4.4. Rough order of magnitude
5.0 Elements of a System of Systems Cost Model
6.0 COSOSIMO Parameters
    6.1. Overview
    6.2. COSOSIMO PRA Parameters
        PRA Size Drivers
        PRA Cost Drivers
    6.3. COSOSIMO SO Parameters
        SO Size Drivers
        SO Cost Drivers
    6.4. COSOSIMO I&T Parameters
        I&T Size Drivers
        I&T Cost Drivers
7.0 An Initial Stage-wise SoS Cost Estimation Model
    7.1. Hybrid Development Process
    7.2. Estimation of SISOS Development Effort for a Given Iteration
    7.3. Viewing the Hybrid Process in the SISOS Environment
    7.4. Combining Agile/Plan-Driven Work in the SISOS Effort Estimates
    7.5. Final Comments on Total SISOS Development Costs
8.0 Conclusions
9.0 References
Appendix A
Appendix B


Abstract

Many Department of Defense (DoD) organizations are attempting to provide new system capabilities through the net-centric integration of existing software-intensive systems into a new system often referred to as a Software-Intensive System of Systems (SISOS). The goal of this approach is to build on existing capabilities to produce new capabilities not provided by the existing systems in a timely and cost-effective manner. Many of these new SISOS efforts, such as the Future Combat Systems, are of a size and complexity unlike their predecessor systems, and cost estimation tools such as the Constructive Cost Model (COCOMO) suite are undergoing significant enhancements to address these challenges. This report describes the unique challenges of SISOS cost estimation, how current tools are changing to support these challenges, and on-going efforts to further support SISOS cost estimation needs. This report concentrates heavily on the COCOMO-based models and tools, but also incorporates activities underway by other cost model vendors.

Acknowledgements

The Data & Analysis Center for Software would like to acknowledge the following people for their review of this report:

Daniel Ferens, ITT Corporation, DACS Analyst

Robert Vienneau, ITT Corporation, DACS Analyst

Thomas McGibbon, ITT Corporation, DACS Director


1.0 Executive Summary

The DoD Defense Acquisition Guidebook [DOD, 2006a] and the USAF Scientific Advisory Board Report on SoSE for Air Force Capability [USAF, 2005] have defined "system of systems engineering (SoSE)" as "The process of planning, analyzing, organizing, and integrating the capabilities of a mix of existing and new systems into a SoS capability greater than the sum of the capabilities of the constituent parts." This report adopts this definition, with the interpretation that "integrating" includes the effort involved in constituent-system source selection, acquisition management, strategic partner coordination, change management, and the levels of testing involved in ensuring a well-integrated system of systems.

There are many potential advantages in investing in a system of systems. These include avoiding the unacceptable delays in service, conflicting plans, bad decisions, and slow responses to fast-moving events that come with current collections of incompatible systems. On the positive side, systems of systems enable organizations to see first, understand first, act first, and finish decisively, and to adapt rapidly to changing circumstances. However, in assessing the return on investment in a system of systems, one must assess the size of the investment, and these costs are very easy to underestimate. The trade press is full of hype about the ease of composing Web 2.0 mashups out of separately developed components. But such compositions only work when the components are very loosely coupled, with easily-worked mismatches; even then, they are only as reliable as their least reliable components or connectors.

For organizations such as DoD that must develop high-assurance systems of systems from closely-coupled, often incompatible and independently evolving, often unprecedented systems, the investment costs for SoSE can be extremely high, particularly if inappropriate SoSE strategies are employed. Although not enough data on completed SoS projects is currently available to calibrate models for estimating these costs, enough is known about the SoSE cost sources and cost drivers to provide a framework for determining the relative cost and risk of developing systems of systems with alternative scopes and development strategies before committing to a particular SoS scope and SoSE strategy.

This report presents such a framework and discusses its use. It has three primary sources of cost, each with a set of primary cost drivers. These are:

Planning, Requirements Management, and Architecting (PRA): Number of SoS-related requirements; number of SoS interface protocols; requirements understanding; level of service requirements; stakeholder team cohesion; PRA team capability; PRA process maturity; PRA tool support; PRA cost/schedule compatibility; and PRA risk resolution preparedness.

Source Selection and Supplier Oversight (SO): Number of independent component system organizations; number of unique component systems; SO requirements understanding; SO architecture maturity; SO level of service requirements; SoSE/supplier team cohesion; SO team capability; SO process maturity; SO tool support; SO process cost/schedule compatibility; and SO risk resolution preparedness.


SoS Integration and Testing (I&T): Number of SoS interface protocols; number of operational scenarios; number of unique component systems; I&T requirements understanding; I&T architecture maturity; I&T level of service requirements; I&T team cohesion; I&T team capability; I&T tool support; I&T process maturity; I&T process cost/schedule compatibility; I&T risk resolution; component system maturity; and component system readiness.


2.0 Introduction

2.1. Purpose

As system development approaches and the technologies used to develop systems evolve, the techniques and tools used to plan and estimate these system development activities likewise need to evolve. This report is designed to provide guidance to those attempting to estimate the effort needed to develop a System of Systems (SoS). Case studies of previous Software-Intensive SoS (SISOS) projects have found that even very well-planned SISOS projects can encounter unanticipated sources of cost growth [Krygiel, 1999]. The intent is that managers and estimators will find this report's guidance useful when generating estimates to develop or extend SoSs. They should use the guidance and conceptual parameters presented in this report to reason about their SoS and to make sure that their estimates account for all critical cost factors that pertain to it.

This guidance includes an overview of typical SoS development activities, existing parametric modeling tools that can be applied to SoS estimation activities, and additional size drivers and cost factors to be considered when preparing estimates. In particular, this report describes:

The unique challenges of SISOS cost estimation

How current tools are changing to support these challenges

On-going efforts to further understand SISOS development processes and the factors that impact the effort to perform these processes

Current techniques and tools to support SISOS cost estimation needs.

This report uses the Constructive Cost Model (COCOMO) family of estimation models as its reference framework, but also incorporates activities underway by other cost model providers.

2.2. Intended Audience

This report is designed to provide high-level insights to managers with overall responsibility for planning and estimating SoS development efforts, as well as more detailed guidance for system estimators who are responsible for generating the actual estimates. This technical report assumes that the reader has a basic understanding of software development and systems engineering estimation techniques.


3.0 Motivation and Context

3.1. Motivation for developing system of systems

Many organizations are attempting to provide new system capabilities through the net-centric integration of existing systems. Early efforts amounted to "mashups" of incompatible systems that too often produced:

Service outages or poor response time

Incompatible decision data and user interfaces

Glacial, inadequate adaptation to change

Inadequate processes to “observe, orient, decide, act” (OODA) when trying to integrate and evolve these systems of systems.

Today, the goal is to engineer approaches that build on existing capabilities to produce new capabilities not provided by the existing systems, and to do so in a timely and cost-effective manner. With this development approach, the system development processes used to define the new architecture, identify sources to either supply or develop the required components, and eventually integrate and test these high-level components are evolving and are being referred to as SoS Engineering (SoSE).

3.2. Nature and definition of "system of systems"

As described in [Lane, 2007], the term "system of systems" has evolved in fairly recent times to describe a subset of large, complex systems. The earliest references in the literature to "systems within systems" or "system of systems" can be found in [Berry, 1964] and [Ackoff, 1971]. These 1960-1970 era SoS concepts are early insights into the evolution of today's SoS. Even though the term "system of systems" was not commonly used at the time, systems of systems were being successfully developed and deployed. These SoSs are represented by undersea surveillance and weapons systems such as the 1950's era Integrated Undersea Surveillance System (IUSS) and Sound Surveillance System (SOSUS), which significantly expanded the capabilities of the World War II-era Anti-Submarine Warfare (ASW) system [IUSSCAA, 2006]; the Global Positioning System (GPS) [NAVSTAR, 2006], which is today considered both an SoS and a component system for other SoSs; and the command and control centers used by military organizations, air controllers, and other agencies responsible for the coordination of multiple resources. As these types of integrated systems became more common, systems engineering professionals and researchers began to define and study them as a special class of systems. And, as the term has become a popular way to represent a strategic and economic approach to enhancing existing system capabilities, there is now an abundance of definitions.

A review of recent publications [Lane and Valerdi, 2005] shows that the term "system of systems" means many things to many different people and organizations. Many seem to be converging on the definition provided in [Maier, 1998]: an evolutionary net-centric architecture that allows geographically distributed component systems to exchange information and perform tasks within the framework that they are not capable of performing on their own outside of the framework. The nature of these tasks often cannot be specified in advance but emerges with use; this is often referred to as "emergent behavior". In addition, the component systems are typically independently managed, have their own purpose, and can operate either on their own or within the SoS.

In the business domain, an example of an SoS is the enterprise-wide integration and sharing of core business information across functional and geographical areas. In the military domain, an example of an SoS is a dynamic communications infrastructure integrating a set of military platforms to support operations in a constantly changing, sometimes adversarial, environment. For some, an SoS may be a multi-system architecture that is planned up-front by a prime contractor or Lead System Integrator (LSI). For others, an SoS is an architecture that evolves over time, often driven by organization needs, new technologies appearing on the horizon, and available budget and schedule. The evolutionary SoS architecture is more of a network architecture that grows with needs and available resources.

In any case, users and nodes in the SoS network may be either fixed or mobile. Communications between component systems in the SoS are often some combination of service-oriented, point-to-point, or broadcast protocols. Networks may tie together other networks as well as nodes and users. SoS component systems typically come and go over time. As mentioned above, these component systems can operate both within the SoS framework and independent of this framework. In a general sense, it is challenging to define clear boundaries of an SoS because of its dynamic nature. Equally challenging is the process of deciding what systems appropriately deserve the SoS label because, depending on an individual’s system-of-interest, one person’s SoS may be another’s component system in an SoS.

With the SoS development approach, system development processes are evolving and are being referred to as SoSE [SoSECE, 2006; USAF, 2005]. SoSE is that set of engineering activities performed to define the desired SoS-level capabilities, develop the SoS-level architecture, identify sources to either supply or develop the required SoS component systems, then integrate and test these high level components within the SoS architecture framework, with the result being a deployable SoS.

The SoS Systems Engineering (SE) Guide [DoD, 2006b] has identified five categories of SoS:

New Development Programs: These are SoSs that are predominantly comprised of new systems and often developed by a LSI using fairly traditional acquisition practices. Examples of this type are the Army’s Future Combat System (FCS) and the Coast Guard’s Integrated Deepwater System.

Development of New Capability by Integrating Current Systems: In this category, currently fielded systems are integrated to form an SoS. An example of this is the Army Battlefield Command System (ABCS).

Mixed System Maturity Levels: The SoSs in this category are those whose component systems are in different phases of the system life cycle and pose unique issues and challenges when combined into an SoS. Examples of these SoSs include Naval Integrated Fire Control-Counter Air (NIFC-CA) and the Single Integrated Air Picture (SIAP).

Sustainment: These SoSs are those comprised of mature component systems in the sustainment phase of their lifecycles. Efforts focus on the sustainment of an existing capability. An example of this type is the Stryker Brigade Combat Team (SBCT or Stryker).

Business: The SoSs in this category are those that span business enterprises. Examples in this category include supply chain management SoSs and human resource management SoSs. Note that business SoSs in this category may also fit into one of the other four categories.

The rest of this report focuses on the estimation of costs for SoSs in the first three categories above: new development programs, development of new capability by integrating current systems, and mixed system maturity levels. Business systems are also covered to the extent that they fall into one of these first three categories.

Key Activities for LSI Organizations in a Contracting Environment: While SoSs are conceptually simple, they get very complex when dealing with security, safety, and decentralized information management on the technical side and multiple stakeholders and vendors/suppliers on the management side. To better understand LSI activities, SoS projects have been observed and LSI engineers surveyed with respect to the types of issues they typically face [Lane, 2005a]. The following summarizes the key activities and associated common issues.

Once an LSI team is under contract to develop an SoS, they quickly begin to concurrently define the scope of the SoS, plan the activities to be performed, analyze the requirements, and start developing the SoS architecture/framework. As the scope, requirements, and architecture start to firm up, the LSI organization begins source selection activities to identify the desired component system suppliers. Then, as the suppliers start coming on board, the LSI organization must focus on teambuilding, re-architecting, and feasibility assurance with the selected suppliers. Teambuilding is critical since the LSI organization and the selected suppliers may have been competitors in the past and now must work together as an efficient, integrated team. Re-architecting is often necessary to make adjustments for the selected system components that may not be compatible with the initial SoS architecture or other selected components. And feasibility assurance is conducted to better evaluate technical options and their associated risks. Many of the technical risks in an SoS framework are due to incompatibilities between different system components or limitations of older system components with today’s technology.

As the SoS development teams begin to coalesce, the LSI organization focuses on incremental acquisition activities for the development and integration/test of the required component systems for the SoS. During this process, there are often continuous changes and new risks that must be managed. In addition, the LSI organization is continuously looking for opportunities to simplify the SoS architecture and reduce effort, schedule, and risks. Key management issues include:


Number of stakeholders – The stakeholders in an SoS development effort are numerous. They come from sponsoring and funding organizations as well as the various user communities that have high expectations for the planned SoS.

Number of development organizations – Because the component systems are often "owned" by an organization other than the sponsoring or LSI organizations, there is often a separate development organization associated with each component system in the SoS. In addition, some of the component systems can be systems of systems in their own right. This means that there may be lower-level suppliers associated with each component system, adding to the number of development organizations.

Number of decision "approvers" – Studies [Blanchette, 2005; Pressman and Wildavsky, 1973] have shown that as the number of people involved in the decision-making process increases, the probability of getting a timely decision (or any decision at all) often decreases. In the SoS development arena, the stakeholders, the system component "owners", and the LSI organization are often all involved in making key decisions.

Cross-cutting risks – These are risks that cut across organizational boundaries and/or component systems (as opposed to component system risks that can be managed by the component system supplier). A key to a successful SoS is negotiating solutions that are optimal for the SoS, and not necessarily optimal for some of the component systems. This requires component system stakeholders or suppliers to sometimes implement changes for the SoS that are to their disadvantage.

Schedules – A key feature of SoSs is that the component systems within an SoS are typically independently owned and managed by another organization. This means that SoS timelines are often controlled by other “outside” goals and timelines. SoS-enabling features are often incorporated into SoS components along with the other enhancements and features planned by the component “owner”. There may be long-lead enhancements that are not required by the SoS architecture/system, but are more important to the component owner or user organization and will delay implementation of the SoS features. Also, these other on-going changes (not required for the SoS) may impact the stability of the component (including its architecture). A current example of this is the limited resources and specialists available to develop new features needed to support today’s Iraq operations (FCS spin outs) vs. features needed to support the Army’s FCS SoS in the future [GAO, 2006]. While this may be perceived as primarily a schedule issue, it can also impact effort since component delivery delays can result in inefficient integration activities and significant rework.

As LSI organizations try to scale up their Traditional SE (TSE) management processes, they find that there are often new and unexpected issues. Typical management issues include [Lane, 2005b]:


Traditional planning and scheduling may lead to unacceptably long schedules, requiring the LSI organization to be more creative in both their technical and implementation approaches.

Planning and tracking activities must integrate inputs from a variety of different organizations, each with its own (and probably different) process.

Traditional oversight and coordination can spread key LSI personnel too thin.

More emphasis is required for contracting and supplier management. Incentives are often needed to better align priorities and focus of the component system supplier organizations. In addition, contracts must provide mechanisms to allow suppliers to participate more in the change management process to help assess impacts and to develop efficient approaches to proposed changes.

Standardization of all processes may be overwhelming. The LSI organization needs to decide what to standardize and what to let the suppliers control.

The decision making process involves considerably more organizations. As mentioned above, this can make the decision making process much more complex and time-consuming and it may have significant impacts on the overall schedule and effort.

Risk management for cross-cutting risks needs to cross organizational boundaries. It is important that risk management activities for cross-cutting risks don’t select strategies that are optimal for one area of the SoS, but are to the detriment of other areas. The focus must be on the overall SoS.

Since SoS development efforts usually span many years and include many incremental or evolutionary developments, there are opportunities for the SoSE organization to adapt and mature their processes to the SoSE environment. One of the key observations of evolving SoSE processes is how LSI organizations are attempting to blend traditional processes with more agile processes [Madachy et al, 2006]. They are more agile when dealing with risk, change, and opportunity management for future increments, but plan for stabilized evolutionary increments in the near term. Key to this approach is knowing when to plan, control, and stabilize and when to be more flexible, agile, and streamlined. The agile teams are responsible for performing acquisition intelligence, surveillance, and reconnaissance functions, and then rebaselining future increment solutions as necessary. (See [Boehm and Turner, 2004] for additional information on blending traditional and agile processes.)

Key SoSE Activities Performed by Government SoSE Teams: Another development approach for government-owned SISOSs is to have a government organization responsible for SoSE activities instead of an LSI. These are primarily government program offices that provide engineering oversight teams. Often these government teams are augmented by support contractors with various specialty skills. These SoSE teams are often used for SoSs where new capabilities are developed by integrating current systems. New and existing system interfaces may be used to integrate the current systems. In addition, new components may be developed to perform data or protocol conversions between component systems to enable integration.

In this situation, little development is performed by the SoSE team. Rather, all of the SoS development activities are performed by suppliers or vendors. There is still a focus on incremental acquisition activities, change management, and risk management, and the key management issues described above still apply. However, there is not as much of an emphasis on source selection for components, since the motivation for this type of SoS is the integration of existing systems in the domain.

Reported Differences Between SoSE and TSE: Many have reported on differences between TSE and SoSE in recent reports and conferences. Most of these differences are in the areas of architecting; prototyping, experimentation, and tradeoffs; and SoS scope and performance. Several [Meilich, 2006; USAF SAB, 2005] have stated that SoS architecting must focus more on composability than traditional design by decomposition (i.e., SoSs are developed by identifying existing systems and integrating them into an SoS, as opposed to developing the SoS in a top-down manner through more traditional functional decomposition processes) and that architectures are net-centric as opposed to hierarchical. It has also been noted that in order to successfully develop an SoS, there must be intense concept-phase analysis, followed by continuous anticipation of change, and supported by on-going experimentation [USAF, 2005]. Extensive modeling and simulation are also required to better understand emergent behaviors [Finley, 2006] and to support early, first-order tradeoffs at the SoS level and evaluations of alternatives [Garber, 2006; Finley, 2006]. Over the long term, [USAF, 2005] reports that it will be important to discover and utilize standard convergence protocols that will "SoS-enable" candidate component systems and support their incorporation into multiple SoSs. SoSs also seem to extend the concepts of system flexibility and adaptability [USAF, 2005], and it has become clear to some that the human must be considered as part of the SoS [Siel, 2006; Meilich, 2006; USAF SAB, 2005]. Finally, SoSs are designed to be dynamically reconfigured as needs change [USAF, 2005]; therefore, the organizational scope of the SoS is defined at runtime instead of during system development [Meilich, 2006].

In the DoD arena, many key challenges for SoSE have been observed. It can be difficult to get the necessary commitment and cooperation between multiple government organizations, the SoS proponents, and the associated suppliers/vendors. Therefore, new business models and incentives are needed to encourage working together at the SoS level [Garber, 2006]. This also requires accountability at the SoS enterprise level and the removal of multiple decision-making layers [Pair, 2006]. Often in the early stages of a large program, there is an urgency and a temptation to take shortcuts. However, experience has shown that in the case of SoSs, it is important to take the time to do the necessary analyses and tradeoffs [Garber, 2006] and to focus on commonality of data, architecture compatibility, and business strategies at the SoS level [Pair, 2006], as well as human-system integration [Siel, 2006; Meilich, 2006], technology maturity [Finley, 2006], and the necessary evolutionary management of the SoS [Boehm, 2006; Meilich, 2006].


Inadequacy of traditional cost models in estimating system of systems costs

When organizations started using SoS concepts to evolve and expand the capabilities of their existing systems, they found that their cost estimation tools covered part of the SoS development activities, but not all of them. If an organization decides to acquire or develop a new system to integrate into the SoS (or to facilitate the integration of existing systems into an SoS), then existing systems engineering, software development, and/or Commercial Off-the-Shelf (COTS) integration cost models can be used to estimate the effort associated with the acquisition/development of the system component. An example of this might be a new "translator" component that converts data between different formats so that no modifications are needed for legacy components. Likewise, if changes must be made to existing (legacy) systems in order to enable SoS connectivity or implement new features desired for SoS-level capabilities, the existing cost models can be used to estimate the effort associated with these system-level changes.

What is not covered by existing cost models is the effort associated with the development of the SoS concepts and architecture, the analysis required to identify the desired SoS component systems, and the integration and test of those component systems in the SoS environment. Figure 1 shows the home-grounds of various cost models and highlights the fact that SoSE activities are currently not specifically addressed by existing cost models. Further, the sizing inputs used by the existing models (e.g., number of requirements, function points, lines of application or COTS glue code) are not well-matched to SoSE sources of effort or sources of information.

Figure 1. Suite of Available Cost Models to Support SISOS Effort Estimation

In addition, [Wilson, 2007] provides a comprehensive analysis of several parametric tools either currently being used or under development to support the estimation of SoS development effort. Many of the tools that Wilson analyzed are adaptations of the software and systems engineering tools shown in Figure 1. His conclusion at this point in time is that SoSs are poorly understood and that the tools and thought processes needed to address the development of these systems are incomplete. As the industry begins to better understand SoSs and SoSE, these tools will evolve to provide cost model capabilities that better cover the broader SISOS domain. The goals for these tools are to:


1. Reduce the risk of underestimating or overestimating the resources needed to support investment in large technology-intensive SoSs

2. Explore alternatives and support trade-off activities

3. Understand the sensitivities of the different cost drivers of SoSE.

The rest of this report describes in more detail the current approaches to SISOS cost estimation, the elements of an SoS cost model, and how SoS cost models can be used to estimate the evolutionary stages of SISOS development.


4.0 Approaches to SISOS Cost Estimation

As with software cost estimation, there are many approaches to estimating SISOS development effort, based on the characteristics of the SoS product, funding mechanisms, the life cycle model used to develop the product, and the current state of the SoS. The following describes how some of the key estimation approaches can be applied to SISOS development.

4.1. Architecture-based estimates using parametric models

Parametric models such as those in the COCOMO [Boehm et al., 2005], SEER [Galorath, 2001], SLIM [QSM, 2006], and PRICE [PRICE, 2006] suites of tools can be used to estimate the effort to develop new SoS component systems, modify existing component systems, or tailor COTS products. These estimates are then combined with the effort to perform the SoSE activities at either the LSI or government oversight level.

4.2. Activity-based estimation

Some SoSE activities are better estimated using a bottom-up, activity-based estimation approach. For example, SoS architecting activities may be based on the number of anticipated capabilities to be implemented. A nominal effort value is determined for analyzing and "architecting" each capability; this value is then used to estimate the total effort for the overall SoS architecting activity. A similar process is used to develop estimates for the other SoSE activities, and then the effort values associated with all of the lower-level activities are summed together to provide an overall estimate.
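To make the arithmetic concrete, the sketch below illustrates this bottom-up approach; the activity names and nominal effort values are hypothetical placeholders, not drawn from any calibrated model.

```python
# Bottom-up, activity-based estimation sketch. Activity names and
# nominal effort values are hypothetical placeholders.

# Nominal effort (person-hours) per unit of each SoSE activity.
NOMINAL_HOURS = {
    "architect_capability": 120,       # per anticipated SoS capability
    "define_interface_protocol": 80,   # per SoS interface protocol
    "oversee_supplier": 200,           # per component system supplier
}

# Anticipated counts for the SoS of interest.
counts = {
    "architect_capability": 25,
    "define_interface_protocol": 6,
    "oversee_supplier": 10,
}

# Estimate each lower-level activity, then sum for the overall estimate.
activity_effort = {name: NOMINAL_HOURS[name] * n for name, n in counts.items()}
total = sum(activity_effort.values())
print(activity_effort)
print(f"Overall activity-based estimate: {total} person-hours")
```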

4.3. Level of effort

For those SoSs that have reached the operations and maintenance or sustainment phase, annual budgets are often established by determining an appropriate level of effort. The main activities in these phases are configuration management, change control, periodic product upgrades, minor enhancements, and necessary problem resolution. Level of effort budgets are often adjusted based on upgrade priorities and risk analysis.

4.4. Rough order of magnitude

In the early concept definition and exploration phases, few details are typically known about the SoS or the actual component system suppliers. However, decision makers need to have some understanding of the target system costs. Several techniques can be used to generate a Rough Order of Magnitude (ROM) estimate. These include estimation by analogy, where costs are based on an existing system development effort of similar size, scope, and technology. Alternatively, the ROM may be based on early architecture-based size drivers such as the number of operational nodes, mission-level operational scenarios, operational activities, nodal information exchange boundaries, key technologies, member systems, and peer systems, as described in [Wang et al, 2007].
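As a minimal sketch of the analogy technique, assuming a single dominant size driver and an illustrative, uncalibrated diseconomy-of-scale exponent:

```python
# ROM estimate by analogy: scale a completed, similar system's actual
# effort by the ratio of a size driver. The exponent is an assumed,
# uncalibrated diseconomy-of-scale factor.

def rom_by_analogy(analog_effort, analog_size, new_size, exponent=1.1):
    return analog_effort * (new_size / analog_size) ** exponent

# Example: a similar SoS with 12 operational nodes took 90,000 hours;
# the target SoS is expected to have 20 operational nodes.
print(round(rom_by_analogy(90_000, 12, 20)))  # roughly 158,000 hours
```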


[Figure 2 depicts a three-level SoS hierarchy: the SoS at Level 0; component systems S1, S2, ..., Sm at Level 1, where a component such as S2 may itself be an SoS; and their sub-systems (S11 ... Smn) at Level 2. It maps cost models onto this hierarchy as follows:

COSOSIMO – Level 0, and other levels if lower-level system components are also SoSs (e.g., S2): SoS Lead System Integrator effort (SoS scoping, planning, requirements, architecting; source selection; teambuilding, re-architecting, and feasibility assurance with selected suppliers; incremental acquisition management; SoS integration and test; transition planning, preparation, and execution; and continuous change, risk, and opportunity management).

COCOMO II – Level 0: development of SoS software-intensive infrastructure and integration tools.

COCOMO II – Levels 1-n: software development for software-intensive components.

COSYSMO – Levels 1-n: systems engineering for SoS components.

COCOTS – Levels 1-n: COTS assessment and integration for COTS-based components.]

Figure 2. SoS Cost Estimation [Lane and Boehm, 2006].

5.0 Elements of a System of Systems Cost Model

As mentioned previously, existing cost models can estimate part of the SISOS development effort. Figure 2 is a hierarchical view of a SISOS, showing the relationships between SoS component systems and the systems that comprise those component systems. Note that a component of one SISOS can often itself be considered a SISOS when viewed outside the higher-level SISOS, giving the higher-level SISOS both a hierarchical and a net-centric architecture view.

Most current approaches to SISOS cost estimation look at both the SoS level and the component systems. Often stakeholders are interested in total SoS development costs, not just the cost of the SoSE activities. Figure 2 illustrates how various existing cost models such as those in the COCOMO suite can be used to estimate many aspects of SoS development and evolution. Using this approach, the total SoS development effort becomes the SoS-level effort from the Constructive SoS Integration Model (COSOSIMO), plus the sum of the effort from all of the other cost models used to estimate the effort associated with required changes to each of the existing SoS component systems, plus the sum of the effort required to develop any new component systems.
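Written out as a formula (our notation, not the report's), the composition is:

```latex
E_{\text{SoS total}} \;=\; E_{\text{COSOSIMO}}
  \;+\; \sum_{i=1}^{m} E_{\text{mod},i}
  \;+\; \sum_{j=1}^{k} E_{\text{new},j}
```

where E_COSOSIMO is the SoS-level engineering effort, each E_mod,i is the estimated effort for changes to one of the m existing component systems, and each E_new,j is the estimated development effort for one of the k new component systems, each produced by the appropriate lower-level model.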

In general, parametric cost models such as those shown in Figures 1 and 2 have similar characteristics with respect to their inputs, functional forms, and outputs. The inputs consist of a set of size drivers that are used to estimate a nominal effort for the activities of interest, plus a set of cost modifiers that provide additional information about the product to be developed, the processes used to develop the product, and the skills and experience levels of the people that will perform the engineering activities. These cost modifiers are used to adjust the nominal effort up or down, depending on whether the selected value is a positive or negative influence on the nominal effort. As described in [Boehm et al, 2005], the cost estimating relationships (CERs) between the size driver(s) and cost modifiers are reflected in the model in terms of the type of influence each has: additive, multiplicative, or exponential. The CER for a particular parameter in a given cost model is determined through the validation and calibration activities of the cost model development. The COCOMO models that have been validated and calibrated with actual historical data and expert judgment calculate and output a total number of estimated hours. Guidance provided with these models can be used to help estimators distribute these hours over the various phases of development. Other models (or parts of some models) are still in the early stages of validation and calibration, but can still be used as conceptual models to help estimators reason about the set of activities to be estimated. The conceptual models have defined sets of size drivers and cost modifiers, along with counting rules for the size drivers and guidance for determining appropriate values for the cost modifiers (typically from "very high" to "nominal" to "very low"), that have been developed through workshops with experts from the University of Southern California (USC) Center for Systems and Software Engineering (CSSE) industry affiliate organizations. In these cases, estimators can use a combination of expert judgment and analogy estimation techniques and adjust these estimates based on the guidance provided in the conceptual models.
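A minimal sketch of this generic functional form, with illustrative constants (not a calibration of any of the models above):

```python
import math

def parametric_effort(size, a=2.5, b=1.0,
                      exponential_drivers=(), multiplicative_drivers=()):
    """Effort = a * size^(b + sum of exponential driver deltas)
                  * product of multiplicative effort modifiers."""
    exponent = b + sum(exponential_drivers)
    modifier = math.prod(multiplicative_drivers) if multiplicative_drivers else 1.0
    return a * size ** exponent * modifier

# A modifier rated above nominal (> 1.0) pushes effort up; below
# nominal (< 1.0) pulls it down; exponential drivers scale diseconomies.
print(parametric_effort(100,
                        exponential_drivers=[0.05],      # e.g., immature processes
                        multiplicative_drivers=[1.2, 0.9]))
```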

The following describes the current state of existing COCOMO cost models that can be used to support the estimation of effort to develop an SoS.

SoS Engineering: COSOSIMO is a conceptual model that can be used to support the estimation of key SoSE activities. These activities include a) planning, requirements management, and architecting; b) source selection and supplier oversight; and c) SoS integration and testing. Using the size drivers and cost drivers developed through workshops with CSSE industry affiliates, users can reason about the SoS to be developed and then develop activity-based estimates for each of the key activity areas. A more detailed description of the COSOSIMO sub-model size drivers and cost drivers is provided in the following section.

Software Development: COCOMO II is a cost model that estimates the effort and schedule required to develop a software system. It is based on the estimated number of lines of code or function points for the software system and outputs the number of labor hours required for planning, requirements analysis, design, code and unit test, integration and test, and delivery. The current calibrated model is based on 161 data points provided by the CSSE industry affiliates. Additional information on COCOMO II may be found in [Boehm et al, 2000].
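For reference, the COCOMO II post-architecture effort equation published in [Boehm et al, 2000] has the form sketched below; the A and B constants are the COCOMO II.2000 calibration values, while the scale factor and effort multiplier ratings in the example are made up for illustration.

```python
import math

A, B = 2.94, 0.91  # COCOMO II.2000 calibration constants

def cocomo2_person_months(ksloc, scale_factors, effort_multipliers):
    """PM = A * KSLOC^E * prod(EM_i), with E = B + 0.01 * sum(SF_j)."""
    exponent = B + 0.01 * sum(scale_factors)
    return A * ksloc ** exponent * math.prod(effort_multipliers)

# 50-KSLOC system with illustrative scale factor and effort
# multiplier ratings (not from any real project).
sf = [3.72, 3.04, 4.24, 3.29, 4.68]  # five scale factors
em = [1.10, 0.87, 1.00]              # a subset of effort multipliers
print(round(cocomo2_person_months(50, sf, em), 1))  # person-months
```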

Systems Engineering: The Constructive Systems Engineering Cost Model (COSYSMO) is a cost model that estimates the systems engineering effort associated with system development projects. It is based on the number of system requirements, system interfaces, algorithms, and operational scenarios, and it outputs the estimated number of systems engineering labor hours for the ANSI/EIA 632 [ANSI/EIA, 1999] standard activities associated with the phases of Conceptualize, Develop, Operational Test and Evaluation, and Transition to Operation. The current calibrated model is based on 40 data points provided by the USC CSSE industry affiliates. Additional information on COSYSMO can be found in [Valerdi, 2005].
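The general COSYSMO approach, per [Valerdi, 2005], is to count each size driver at easy, nominal, and difficult complexity levels and weight the counts before applying the effort equation. The sketch below shows only that aggregation step; the weights are placeholders, not the calibrated COSYSMO values.

```python
# counts[driver] = (easy, nominal, difficult) occurrences
counts = {
    "system_requirements": (50, 30, 10),
    "system_interfaces": (4, 6, 2),
    "algorithms": (3, 5, 1),
    "operational_scenarios": (2, 4, 2),
}

# Placeholder complexity weights per driver (easy, nominal, difficult).
weights = {
    "system_requirements": (0.5, 1.0, 5.0),
    "system_interfaces": (1.1, 2.8, 6.3),
    "algorithms": (2.2, 4.1, 11.5),
    "operational_scenarios": (6.2, 14.4, 30.0),
}

# Weighted sum of all driver counts; this aggregate size then feeds
# the parametric effort equation.
aggregate_size = sum(
    n * w
    for driver, triple in counts.items()
    for n, w in zip(triple, weights[driver])
)
print(f"Aggregate size: {aggregate_size:.1f}")
```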

COTS Integration: The Constructive COTS (COCOTS) integration cost model is comprised of three parts: a COTS assessment sub-model, a tailoring sub-model, and a glue code development sub-model. The assessment sub-model is a conceptual model used to reason about the cost associated with the identification, assessment, and selection of viable COTS products. The tailoring sub-model is also a conceptual model, used to reason about the COTS product tailoring that will be required to configure the COTS product for use in a specific context. It includes parameter initialization, incorporation of organization-specific business rules, establishment of user groups and security features, screen customization, and report definitions. The glue code sub-model estimates the effort required to integrate the COTS product into a larger system or enterprise. The glue code sub-model is based on 20 data points provided by the USC CSSE industry affiliates. Additional information on COCOTS can be found in [Abts, 2004].

Both PRICE and SEER are early adopters of this approach to SoS cost estimation, using their existing cost estimation tools to estimate the effort associated with the development and modification of SoS components, then using non-parametric techniques and aspects of the COSOSIMO conceptual model to complete the SoS effort estimate.


[Figure 3 shows the conceptual effort profiles of the three COSOSIMO sub-models (Planning, Requirements Management, and Architecting; Source Selection and Supplier Oversight; and SoS Integration and Testing) across the Inception, Elaboration, Construction, and Transition phases.]

Figure 3. Conceptual Overview of COSOSIMO Sub-Models.

6.0 COSOSIMO Parameters [1]

[1] Reprinted from USC-CSE-TR-2006-606 [Lane, 2006] with permission.

6.1. Overview

COSOSIMO is designed to estimate the effort associated with the LSI or SoSE team activities to define the SoS architecture, identify sources to either supply or develop the required SoS component systems, and eventually integrate and test these high-level component systems. (Note: The term LSI is used in this section to refer to either LSI or SoSE teams.) For the purposes of this cost model, an SoS is defined as an evolutionary net-centric architecture that allows geographically distributed component systems to exchange information and perform tasks within the framework that they are not capable of performing on their own outside of the framework. The component systems may operate within the SoS framework as well as outside of the framework, and may dynamically come and go as needed or available. In addition, the component systems are typically independently developed and managed by organizations/vendors other than the SoS sponsors or the LSI.

Recent COSOSIMO workshops with USC CSSE industry affiliates have resulted in the definition of three COSOSIMO sub-models: a planning, requirements management, and architecting (PRA) sub-model; a source selection and supplier oversight (SO) sub-model; and an SoS integration and testing (I&T) sub-model. The conceptual effort profiles for each sub-model are shown in Figure 3.

This section describes the parameters for each of the COSOSIMO sub-models. The parameters include a set of size drivers that are used to calculate a nominal effort for the set of activities associated with the sub-model and a set of cost drivers that are used to adjust the nominal effort based on related SoS architecture, process, and personnel characteristics. Each size driver description includes a definition of the parameter as well as associated counting rules and guidance for assigning complexity ratings. Each cost driver description includes a definition of the parameter as well as guidance for assigning the appropriate rating factor. Note that several of the COSOSIMO parameters have multiple aspects defined. In these cases, when defining values for a given parameter, the estimator should review the associated aspects, determine values for each, and then combine the aspect values into a single parameter value by weighting the relative importance of each aspect for the SoS of interest and using the weights to determine the most appropriate value for the given parameter.
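A minimal sketch of that aspect-weighting step, assuming a simple one-to-five numeric scale for the ratings (the scale and weights are our illustration, not part of the published model):

```python
RATING_SCALE = {"very low": 1, "low": 2, "nominal": 3, "high": 4, "very high": 5}

def combine_aspects(aspect_ratings, weights):
    """Weighted average of aspect ratings, mapped back to a rating name."""
    score = sum(RATING_SCALE[r] * weights[a] for a, r in aspect_ratings.items())
    score /= sum(weights.values())
    # Snap the weighted score to the nearest named rating level.
    return min(RATING_SCALE, key=lambda name: abs(RATING_SCALE[name] - score))

# Example: a level-of-service driver whose criticality aspect matters
# twice as much as its difficulty aspect for this SoS.
print(combine_aspects({"difficulty": "high", "criticality": "very high"},
                      {"difficulty": 1.0, "criticality": 2.0}))  # "very high"
```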

Finally, COSOSIMO workshop findings indicate that some of the SoS LSI activities are similar to systems engineering activities addressed by COSYSMO and have similar size and cost drivers. Therefore, some of the COSOSIMO parameter definitions are adapted from the COSYSMO definitions in [Valerdi, 2005] and are indicated by a footnote.

6.2. COSOSIMO PRA Parameters

The LSI PRA activities are those associated with SoS concept development; requirements identification, analysis, and evolution; and SoS architecture development and evolution, as well as the long-term planning for providing incremental SoS capabilities in accordance with the SoS sponsor's cost and schedule targets.

PRA Size Drivers

Number of SoS-Related Requirements [2]: The number of requirements for the SoS of interest at the SoS level. Requirements may be functional, performance, feature, or service-oriented in nature, depending on the methodology used for specification. They may also be defined by the customer or contractor. SoS requirements can typically be quantified by counting the number of applicable shalls, wills, shoulds, and mays in the SoS or marketing specification (a sketch of this counting rule follows Table 1). Note that some work may be required to decompose requirements to a consistent level so that they may be counted accurately for the appropriate SoS-of-interest. Table 1 contains the complexity definitions for the SoS-related requirements.

[2] Adapted to the SoS environment from COSYSMO.


Table 1. SoS-Related Requirements Complexity Ratings.

Easy: Simple to implement; traceable to source; little requirements overlap.
Nominal: Familiar; can be traced to source with some effort; some overlap.
Difficult: Complex to implement or engineer; hard to trace to source; high degree of requirements overlap.
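The counting rule above (tally the applicable shalls, wills, shoulds, and mays) can be sketched as a simple keyword scan; real counts still require first decomposing requirements to a consistent level.

```python
import re

KEYWORDS = re.compile(r"\b(shall|will|should|may)\b", re.IGNORECASE)

def count_sos_requirements(spec_text):
    """Tally shall/will/should/may statements in an SoS specification."""
    counts = {}
    for match in KEYWORDS.finditer(spec_text):
        word = match.group(1).lower()
        counts[word] = counts.get(word, 0) + 1
    counts["total"] = sum(counts.values())
    return counts

spec = ("The SoS shall exchange track data among component systems. "
        "Each node will publish its status. Operators may override routing.")
print(count_sos_requirements(spec))  # {'shall': 1, 'will': 1, 'may': 1, 'total': 3}
```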

Number of SoS Interface Protocols: The number of distinct net-centric interface protocols to be provided/supported by the SoS framework. Note: This does NOT include interfaces internal to the SoS component systems, but it does include interfaces external to the SoS and between the SoS component systems. Also note that this is not a count of total interfaces, but rather a count of distinct protocols at the SoS level. (In many SoSs, the total number of interfaces may be very dynamic as component systems come and go in the SoS environment. In addition, there may be multiple instances of a given type of component system). Table 2 contains the complexity definitions for the SoS interface protocols.

Table 2. Interface Protocol Complexity Ratings.

Easy: Simple protocol; may already be supported by several SoS component systems; uncoupled; well understood.
Nominal: Moderately complex protocol; may already be supported by some SoS component systems; loosely coupled; predictable behavior.
Difficult: Highly complex or new protocol(s); currently supported by few if any SoS component systems; highly coupled; behavior not easily predictable.

PRA Cost Drivers

Requirements Understanding²: A parameter that rates the level of understanding of the SoS requirements by all of the SoS stakeholders, including the SoS customers and sponsors, SoS PRA team members, component system owners, and users. Primary sources of added systems engineering effort are unprecedented capabilities, unfamiliar domains, or capabilities whose requirements are emergent with use. Table 3 defines the various rating values for the requirements understanding cost driver. (Note: These rating definitions are the same as the ones for the SO and I&T requirements understanding cost drivers, but should be evaluated in terms of the level of understanding among all of the SoS stakeholders, with emphasis on the SoS PRA team members.)


Table 3. PRA Requirements Understanding Ratings.

Very Low: Poor understanding; emergent requirements or unprecedented capabilities.

Low: Minimal understanding; many undefined areas.

Nominal: Reasonable understanding; some undefined areas.

High: Strong understanding; few undefined areas.

Very High: Full understanding of requirements; familiar capabilities.

Level of Service Requirements²: A parameter that rates the difficulty and criticality of satisfying the ensemble of level of service requirements or Key Performance Parameters (KPPs), such as security, safety, transaction speed, communication latency, interoperability, flexibility/adaptability, and reliability. Table 4 defines the various rating values for the level of service requirements cost driver. (Note: These rating definitions are the same as the ones for the SO and I&T level of service requirements cost drivers, but should be evaluated in terms of their impacts on the PRA activities.)

Table 4. PRA Level of Service Requirements Ratings.

Difficulty: Very Low = simple, single dominant KPP; Low = some coupling among KPPs; Nominal = moderately complex, coupled KPPs; High = difficult, coupled KPPs or some conflicts between KPPs that may require tradeoffs; Very High = very complex, tightly coupled KPPs or significant conflicts between KPPs requiring tradeoffs.

Criticality: Very Low = slight inconvenience; Low = easily recoverable losses; Nominal = some loss; High = high financial loss; Very High = risk to human life.

SoS Stakeholder Team Cohesion²: A multi-attribute parameter that includes leadership, shared vision, diversity of stakeholders, approval cycles, group dynamics, Integrated Product Team (IPT) framework, team dynamics, trust, and amount of change in responsibilities. It also represents the heterogeneity of the stakeholder community: the end users, customers, implementers, and development team. Table 5 defines the various rating values for the SoS stakeholder team cohesion cost driver. (Note: These rating definitions are the same as the ones for the SO and I&T team cohesion cost drivers, but should be evaluated in terms of potential impacts on the PRA activities.)


Table 5. SoS Stakeholder Team Cohesion Ratings.

Culture: Very Low = stakeholders with diverse expertise, task nature, language, culture, and infrastructure; highly heterogeneous stakeholder communities. Low = heterogeneous stakeholder community; some similarities in language and culture. Nominal = shared project culture. High = strong team cohesion and project culture; multiple similarities in language and expertise. Very High = virtually homogeneous stakeholder communities; institutionalized project culture.

Compatibility: Very Low = highly conflicting organizational objectives. Low = converging organizational objectives. Nominal = compatible organizational objectives. High = clear roles and responsibilities. Very High = strong mutual advantage to collaboration.

Familiarity and trust: Very Low = lack of trust. Low = willing to collaborate, little experience. Nominal = some familiarity and trust. High = extensive successful collaboration. Very High = very high level of familiarity and trust.

LSI PRA Team Capability: A parameter that represents the anticipated level of PRA team cooperation and cohesion, personnel capability and continuity, and PRA personnel experience with the relevant domains, applications, languages, and tools. Table 6 defines the various rating values for the LSI PRA team capability cost driver.

Table 6. LSI PRA Team Capability Ratings.

PRA Cohesion: Very Low = highly conflicting organizational objectives; lack of trust. Low = converging organizational objectives; willing to collaborate, little experience. Nominal = compatible organizational objectives; some familiarity and trust. High = clear roles and responsibilities; extensive successful collaboration. Very High = strong mutual advantage to collaboration; very high level of familiarity and trust.

PRA Capability: Very Low = 15th percentile; Low = 35th percentile; Nominal = 55th percentile; High = 75th percentile; Very High = 90th percentile.

Specific SoS PRA-Relevant Experience: Very Low = less than 2 months; Low = 1 year of continuous experience plus other technical experience in a similar job; Nominal = 3 years of continuous experience; High = 5 years of continuous experience; Very High = 10 years of continuous experience.

Expected Annual PRA Turnover: Very Low = 48%; Low = 24%; Nominal = 12%; High = 6%; Very High = 3%.


LSI PRA Process Maturity: A parameter that rates the maturity level and completeness of the LSI’s PRA processes and plans. Table 7 defines the various rating values for the LSI PRA process maturity cost driver.

Table 7. LSI PRA Process Maturity Ratings.

PRA Assessment Rating (Capability or Maturity): Very Low = Level 0 (if continuous model); Low = Level 1; Nominal = Level 2; High = Level 3; Very High = Level 4; Extra High = Level 5.

PRA Team Behavioral Characteristics: Very Low = ad hoc approach to process performance. Low = performed PRA process; activities driven only by immediate contractual or customer requirements; PRA focus limited. Nominal = managed PRA process; activities driven by customer and stakeholder needs in a suitable manner; PRA focus is a project-centric approach, not driven by organizational processes. High = defined PRA process; activities driven by benefit to the project; PRA focus is a process approach driven by organizational processes tailored for the project. Very High = quantitatively managed PRA process; activities driven by PRA benefit. Extra High = optimizing PRA process; continuous improvement; activities driven by systems engineering and organizational benefit.

PRA Tool Support²: A parameter that rates the coverage, integration, and maturity of the PRA tools in the SoS engineering and management environments. Table 8 defines the various rating values for the PRA tool support cost driver.

Table 8. PRA Tool Support Ratings.

Very Low: No PRA tools.

Low: Simple PRA tools; little integration.

Nominal: Basic PRA tools, moderately integrated throughout the systems engineering process.

High: Strong, mature PRA tools, moderately integrated with other disciplines.

Very High: Strong, mature, proactive use of PRA tools, integrated with the process, model-based SE, and management systems.

PRA Cost/Schedule Compatibility: The extent of business or political pressures to reduce the cost and schedule associated with the PRA activities and processes. Table 9 defines the various rating values for the PRA cost/schedule compatibility cost driver. (Note: These rating definitions are the same as the ones for the SO and I&T cost/schedule compatibility cost drivers, but should be evaluated in terms of potential impacts on the PRA activities.)


Table 9. PRA Cost/Schedule Compatibility Ratings.

Very Low: Estimates exceed the budgeted cost and schedule by more than 100%.

Low: Estimates are between 50% and 100% greater than the budgeted cost and schedule.

Nominal: Estimates are between 20% and 50% greater than the budgeted cost and schedule.

High: Budgeted cost and schedule are within 20% of the estimates.
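The brackets in Table 9 can be applied mechanically once an independent estimate and a budget are in hand. The sketch below is a minimal illustration of that classification; the function name and labels are ours, while the percentage brackets follow the table as interpreted above.

```python
# Minimal sketch of the Table 9 brackets: classify the fractional
# overrun of an independent estimate relative to the budget.
def cost_schedule_compatibility(estimate: float, budget: float) -> str:
    overrun = (estimate - budget) / budget  # fractional overrun
    if overrun > 1.00:
        return "Very Low"   # estimate more than 100% above budget
    if overrun > 0.50:
        return "Low"        # 50-100% above budget
    if overrun > 0.20:
        return "Nominal"    # 20-50% above budget
    return "High"           # within 20% of budget

# Example: a 19.2 PM estimate against a 12.0 PM budget is 60% over.
print(cost_schedule_compatibility(estimate=19.2, budget=12.0))  # -> "Low"
```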

SoS PRA Risk Resolution: A multi-attribute parameter that represents the number of major SoS PRA risk items, the maturity of the associated risk management and mitigation plan, compatibility of schedules and budgets, expert availability, tool support, and level of uncertainty in SoS PRA risk areas. Table 10 defines the various rating values for the SoS PRA risk resolution cost driver.

Table 10. SoS PRA Risk Resolution Ratings.

Number and criticality of PRA risk items: Very Low = more than 10 critical; Low = 5-10 critical; Nominal = 2-4 critical; High = 1 critical; Very High = fewer than 10 non-critical.

PRA risk mitigation activities: Very Low = none; Low = little; Nominal = some; High = risks generally covered; Very High = risks fully covered.

Schedule, budget, and internal milestones compatible with the PRA Risk Management Plan: Very Low = none; Low = little; Nominal = some; High = generally; Very High = mostly.

Percentage of top system engineers and integrators available to support PRA activities: Very Low = 20%; Low = 40%; Nominal = 60%; High = 80%; Very High = 100%.

Tool support available for tracking PRA issues: Very Low = none; Low = little; Nominal = some; High = good; Very High = strong.

Level of uncertainty in PRA risk areas: Very Low = extreme; Low = significant; Nominal = considerable; High = some; Very High = little.

6.3. COSOSIMO SO Parameters

The LSI SO activities are those associated with the identification of potential component system suppliers or vendors; the development of Requests for Proposals (RFPs) and statements of work for candidate suppliers/vendors; the evaluation of supplier/vendor responses; the selection of suppliers/vendors; and the ongoing oversight of supplier/vendor performance through delivery and validation/verification of the desired component system.


SO Size Drivers

Number of Independent Component System Organizations: The number of organizations managed by the LSI that are providing SoS component systems. Table 11 contains the complexity definitions for component system organizations.

Table 11. Component System Organization Ratings.

Easy: Vendor/supplier has previously worked closely with the LSI and/or many of the other vendors/suppliers working on the SoS of interest; no current competition between the vendor/supplier and the LSI or other vendors/suppliers on the system of interest.

Nominal: Vendor/supplier has previously worked to some extent with the LSI and/or some of the other vendors/suppliers working on the SoS of interest; no current significant competition between the vendor/supplier and the LSI or other vendors/suppliers on the system of interest.

Difficult: Vendor/supplier has not worked with the LSI in any significant way; may be competing with the LSI or other SoS vendors/suppliers for significant related work.

Number of Unique Component Systems: The number of types of component systems that are planned to operate within the SoS framework. If there are multiple versions of a given type that have different interfaces, then the different versions should also be included in the count of component systems. Table 12 contains the complexity definitions for component systems.

Table 12. Component System Ratings.

Easy: Component system is a relatively open system with many external interfaces compatible with the SoS architecture.

Nominal: Component system is somewhat open and has some compatible external interfaces, but will require additional interfaces key to SoS operations.

Difficult: Component system is currently a closed, stove-pipe system that has few or no external interfaces compatible with the SoS architecture, or the component system is only in the planning stages or under initial development.

SO Cost Drivers

Requirements Understanding²: A parameter that rates the level of understanding of the SoS requirements between the LSI and the component system suppliers/vendors. Primary sources of added systems engineering effort are unprecedented capabilities, unfamiliar domains, or capabilities whose requirements are emergent with use. Table 13 defines the various rating values for the SO requirements understanding cost driver. (Note: These rating definitions are the same as the ones for the PRA and I&T requirements understanding cost drivers, but should be evaluated in terms of the level of understanding between the LSI and the suppliers/vendors.)


Table 13. SO Requirements Understanding Ratings.

Very Low: Poor understanding; emergent requirements or unprecedented capabilities.

Low: Minimal understanding; many undefined areas.

Nominal: Reasonable understanding; some undefined areas.

High: Strong understanding; few undefined areas.

Very High: Full understanding of requirements; familiar capabilities.

Architecture Maturity: A parameter that represents the level of maturity of the SoS architecture. It includes the level of detail of the interface protocols and the level of understanding of the performance of the protocols in the SoS framework. Table 14 defines the various rating values for the SoS architecture maturity cost driver. (Note: These rating definitions are the same as the ones for the I&T architecture maturity cost driver, but should be evaluated in terms of potential impacts on the SO activities.)

Table 14. SO Architecture Maturity Ratings.

Very Low: Incomplete architecture specification, especially in unprecedented areas; many “To Be Determined” (TBD) elements in the architecture/interface specifications, especially in unprecedented areas; no feasibility analyses or prototypes developed for high-risk areas; little understanding of expected SoS scalability and performance.

Low: Relatively complete architecture specification, but unprecedented areas at a high level of specification; some TBD elements in the architecture/interface specifications; few feasibility analyses or prototypes developed for high-risk areas; scalability and performance aspects not investigated significantly.

Nominal: Relatively complete architecture specification; few TBD elements in the architecture/interface specifications; some feasibility analyses or prototypes developed for high-risk areas; scalability and performance aspects understood to a limited extent.

High: Complete architecture specification at a moderately detailed level of specification; no TBD elements in the architecture/interface specifications; feasibility analyses and prototypes developed for many unprecedented areas; most scalability and performance aspects understood reasonably well.

Very High: Complete architecture specification with relatively detailed specification in high-risk areas; no TBD elements in the architecture/interface specifications; feasibility analyses and prototypes developed for most or all unprecedented areas; scalability and performance aspects well understood.

Level of Service Requirements²: A parameter that rates the difficulty and criticality of satisfying the ensemble of level of service requirements or KPPs, such as security, safety, transaction speed, communication latency, interoperability, flexibility/adaptability, and reliability. Table 15 defines the various rating values for the SoS level of service requirements cost driver. (Note: These rating definitions are the same as the ones for the PRA and I&T level of service requirements cost drivers, but should be evaluated in terms of potential conflicts or incompatibilities between the component systems and the SoS-level requirements.)


Table 15. SO Level of Service Requirements Ratings.

Difficulty: Very Low = simple, single dominant KPP; Low = some coupling among KPPs; Nominal = moderately complex, coupled KPPs; High = difficult, coupled KPPs or some conflicts between KPPs that may require tradeoffs; Very High = very complex, tightly coupled KPPs or significant conflicts between KPPs requiring tradeoffs.

Criticality: Very Low = slight inconvenience; Low = easily recoverable losses; Nominal = some loss; High = high financial loss; Very High = risk to human life.

LSI/Supplier Team Cohesion²: A multi-attribute parameter that includes leadership, shared vision, diversity of stakeholders, approval cycles, group dynamics, IPT framework, team dynamics, trust, and amount of change in responsibilities. It also represents the heterogeneity of the stakeholder community: the end users, customers, implementers, and development team. Table 16 defines the various rating values for the LSI/supplier team cohesion cost driver. (Note: These rating definitions are the same as the ones for the PRA and I&T team cohesion cost drivers, but should be evaluated in terms of potential impacts on the SO activities.)

Table 16. LSI/Supplier Team Cohesion Ratings.

Culture: Very Low = stakeholders with diverse expertise, task nature, language, culture, and infrastructure; highly heterogeneous stakeholder communities. Low = heterogeneous stakeholder community; some similarities in language and culture. Nominal = shared project culture. High = strong team cohesion and project culture; multiple similarities in language and expertise. Very High = virtually homogeneous stakeholder communities; institutionalized project culture.

Compatibility: Very Low = highly conflicting organizational objectives. Low = converging organizational objectives. Nominal = compatible organizational objectives. High = clear roles and responsibilities. Very High = strong mutual advantage to collaboration.

Familiarity and trust: Very Low = lack of trust. Low = willing to collaborate, little experience. Nominal = some familiarity and trust. High = extensive successful collaboration. Very High = very high level of familiarity and trust.

LSI SO Team Capability: Represents the anticipated level of SO team cooperation and cohesion, personnel capability and continuity, as well as SO personnel experience with the relevant domains, applications, language, integration tools, and integration platform(s) used by the various suppliers/vendors. Table 17 defines the various rating values for the LSI SO team capability cost driver.


Table 17. LSI SO Team Capability Ratings.

SO Cohesion: Very Low = highly conflicting organizational objectives; lack of trust. Low = converging organizational objectives; willing to collaborate, little experience. Nominal = compatible organizational objectives; some familiarity and trust. High = clear roles and responsibilities; extensive successful collaboration. Very High = strong mutual advantage to collaboration; very high level of familiarity and trust.

SO Capability: Very Low = 15th percentile; Low = 35th percentile; Nominal = 55th percentile; High = 75th percentile; Very High = 90th percentile.

Specific SoS SO-Relevant Experience: Very Low = less than 2 months; Low = 1 year of continuous experience plus other technical experience in a similar job; Nominal = 3 years of continuous experience; High = 5 years of continuous experience; Very High = 10 years of continuous experience.

Expected Annual SO Staff Turnover: Very Low = 48%; Low = 24%; Nominal = 12%; High = 6%; Very High = 3%.

LSI SO Process Maturity: A parameter that rates the maturity level and completeness of the LSI’s SO processes and plans. Table 18 defines the various rating values for the LSI SO process maturity cost driver.

Table 18. LSI SO Process Maturity Ratings.

SO Assessment Rating (Capability or Maturity): Very Low = Level 0 (if continuous model); Low = Level 1; Nominal = Level 2; High = Level 3; Very High = Level 4; Extra High = Level 5.

SO Team Behavioral Characteristics: Very Low = ad hoc approach to process performance. Low = performed SO process; activities driven only by immediate contractual or customer requirements; SO focus limited. Nominal = managed SO process; activities driven by customer and stakeholder needs in a suitable manner; SO focus is a project-centric approach, not driven by organizational processes. High = defined SO process; activities driven by benefit to the project; SO process approach driven by organizational processes tailored for the project. Very High = quantitatively managed SO process; activities driven by SO benefit. Extra High = optimizing SO process; continuous improvement; activities driven by systems engineering and organizational benefit.

SO Tool Support²: A parameter that rates the coverage, integration, and maturity of SO tools in the SoS engineering and management environment. Table 19 defines the various rating values for the SO tool support cost driver.


Table 19. SO Tool Support Ratings.

Very Low: No SO tools.

Low: Simple SO tools; little integration.

Nominal: Basic SO tools, moderately integrated throughout the systems engineering process.

High: Strong, mature SO tools, moderately integrated with other disciplines.

Very High: Strong, mature, proactive use of SO tools, integrated with the process and management systems.

SO Process Cost/Schedule Compatibility: The extent of business or political pressures to reduce cost and schedule. Table 20 defines the various rating values for the SO process cost/schedule compatibility cost driver. (Note: These rating definitions are the same as the ones for the PRA and I&T cost/schedule compatibility cost drivers, but should be evaluated in terms of potential impacts on the SO activities.)

Table 20. SO Process Cost/Schedule Compatibility Ratings.

Very Low: Estimates exceed the budgeted cost and schedule by more than 100%.

Low: Estimates are between 50% and 100% greater than the budgeted cost and schedule.

Nominal: Estimates are between 20% and 50% greater than the budgeted cost and schedule.

High: Budgeted cost and schedule are within 20% of the estimates.

SoS SO Risk Resolution: A multi-attribute parameter that represents the number of major SoS SO risk items, the maturity of the associated risk management and mitigation plans, compatibility of schedules and budgets, expert availability, tool support, and level of uncertainty in SoS SO risk areas. Table 21 defines the various rating values for the SoS SO risk resolution cost driver.

Table 21. SoS SO Risk Resolution Ratings.

Number and criticality of SO risk items: Very Low = more than 10 critical; Low = 5-10 critical; Nominal = 2-4 critical; High = 1 critical; Very High = fewer than 10 non-critical.

SO risk mitigation activities: Very Low = none; Low = little; Nominal = some; High = risks generally covered; Very High = risks fully covered.

Schedule, budget, and internal milestones compatible with the SO Risk Management Plan: Very Low = none; Low = little; Nominal = some; High = generally; Very High = mostly.

Percentage of top system engineers and integrators available to support SO activities: Very Low = 20%; Low = 40%; Nominal = 60%; High = 80%; Very High = 100%.

Tool support available for tracking SO issues: Very Low = none; Low = little; Nominal = some; High = good; Very High = strong.

Level of uncertainty in SO risk areas: Very Low = extreme; Low = significant; Nominal = considerable; High = some; Very High = little.


6.4. COSOSIMO I&T Parameters

The LSI I&T activities are those associated with SoS component system integration and verification/validation testing at the SoS level. These activities include integration and test planning, set-up of the integration and test environments and tools, development of test data and procedures, and the actual execution and tracking of integration and verification/validation tests. (Note: the effort to develop I&T tools and simulators should be estimated using the COCOMO II cost model described in [Boehm et al, 2000].)

I&T Size Drivers

Number of SoS Interface Protocols: The number of distinct net-centric interface protocols to be provided/supported by the SoS framework. Note: This does NOT include interfaces internal to the SoS component systems, but it does include interfaces external to the SoS and between the SoS component systems. Also note that this is not a count of total interfaces (in many SoSs, the total number of interfaces may be very dynamic as component systems come and go in the SoS environment; in addition, there may be multiple instances of a given type of component system), but rather a count of distinct protocols at the SoS level. Table 22 contains the complexity definitions for the SoS interface protocols. (Note: This is the same size driver that is part of the PRA sub-model. It is included here in the I&T section for completeness.)

Table 22. Interface Protocol Complexity Ratings.

Easy: Simple protocol; may already be supported by several SoS component systems; uncoupled; well understood.

Nominal: Moderately complex protocol; may already be supported by some SoS component systems; loosely coupled; predictable behavior.

Difficult: Highly complex or new protocol(s); currently supported by few if any SoS component systems; highly coupled; behavior not easily predictable.

Number of Operational Scenarios²: The number of operational scenarios that the SoS must satisfy. Such scenarios include both the nominal stimulus-response thread and all of the off-nominal threads resulting from bad or missing data, unavailable processes, network connections, or other exception-handling cases. The number of scenarios can typically be quantified by counting the number of SoS states, modes, and configurations defined in the SoS concept of operations, or by counting the number of “sea-level” use cases [Cockburn, 2001], including off-nominal extensions, developed as part of the operational architecture. Table 23 contains the complexity definitions for the SoS operational scenarios.


Table 23. Operational Scenario Complexity Ratings.

Easy: Well defined; loosely coupled; timelines not an issue; few, simple off-nominal threads.

Nominal: Loosely defined; moderately coupled; timelines a constraint; moderate number or complexity of off-nominal threads.

Difficult: Ill defined; tightly coupled or many dependencies/conflicting requirements; tight timelines through the scenario network; many or very complex off-nominal threads.

Number of Unique Component Systems: The number of types of component systems that are planned to operate within the SoS framework. If there are multiple versions of a given type that have different interfaces, then the different versions should also be included in the count of component systems. Table 24 contains the complexity definitions for the SoS component systems.

Table 24. Component Systems Complexity Ratings.

Easy: Component system is a relatively open system with many external interfaces compatible with the SoS architecture.

Nominal: Component system is somewhat open and has some compatible external interfaces, but will require additional interfaces key to SoS operations.

Difficult: Component system is currently a closed, stove-pipe system that has few or no external interfaces compatible with the SoS architecture, or the component system is only in the planning stages or under initial development.

I&T Cost Drivers

Requirements Understanding²: A parameter that rates the level of understanding of the SoS requirements by all of the SoS stakeholders, including the SoS customers and sponsors, SoS I&T team members, component system owners, and users. Primary sources of added systems engineering effort are unprecedented capabilities, unfamiliar domains, or capabilities whose requirements are emergent with use. Table 25 defines the various rating values for the requirements understanding cost driver. (Note: These rating definitions are the same as the ones for the PRA and SO requirements understanding cost drivers, but should be evaluated in terms of the level of understanding among all of the SoS stakeholders, with emphasis on the SoS I&T team members.)

Table 25. I&T Requirements Understanding Ratings.

Very Low: Poor understanding; emergent requirements or unprecedented capabilities.

Low: Minimal understanding; many undefined areas.

Nominal: Reasonable understanding; some undefined areas.

High: Strong understanding; few undefined areas.

Very High: Full understanding of requirements; familiar capabilities.


Architecture Maturity: A parameter that represents the level of maturity of the SoS architecture. It includes the level of detail of the interface protocols and the level of understanding of the performance of the protocols in the SoS framework. Table 26 defines the various rating values for the architecture maturity cost driver. (Note: These rating definitions are the same as the ones for the SO architecture maturity cost driver, but should be evaluated in terms of potential impacts on the I&T activities.)

Table 26. I&T Architecture Maturity Ratings.

Very Low: Incomplete architecture specification, especially in unprecedented areas; many TBD elements in the architecture/interface specifications, especially in unprecedented areas; no feasibility analyses or prototypes developed for high-risk areas; little understanding of expected SoS scalability and performance.

Low: Relatively complete architecture specification, but unprecedented areas at a high level of specification; some TBD elements in the architecture/interface specifications; few feasibility analyses or prototypes developed for high-risk areas; scalability and performance aspects not investigated significantly.

Nominal: Relatively complete architecture specification; few TBD elements in the architecture/interface specifications; some feasibility analyses or prototypes developed for high-risk areas; scalability and performance aspects understood to a limited extent.

High: Complete architecture specification at a moderately detailed level of specification; no TBD elements in the architecture/interface specifications; feasibility analyses and prototypes developed for many unprecedented areas; most scalability and performance aspects understood reasonably well.

Very High: Complete architecture specification with relatively detailed specification in high-risk areas; no TBD elements in the architecture/interface specifications; feasibility analyses and prototypes developed for most or all unprecedented areas; scalability and performance aspects well understood.

Level of Service Requirements²: A parameter that rates the difficulty and criticality of satisfying the ensemble of level of service requirements or KPPs, such as security, safety, transaction speed, communication latency, interoperability, flexibility/adaptability, and reliability. Table 27 defines the various rating values for the I&T level of service requirements cost driver. (Note: These rating definitions are the same as the ones for the PRA and SO level of service requirements cost drivers, but should be evaluated in terms of their impacts on the I&T activities.)

Table 27. I&T Level of Service Requirements Ratings.

Difficulty: Very Low = simple, single dominant KPP; Low = some coupling among KPPs; Nominal = moderately complex, coupled KPPs; High = difficult, coupled KPPs or some conflicts between KPPs that may require tradeoffs; Very High = very complex, tightly coupled KPPs or significant conflicts between KPPs requiring tradeoffs.

Criticality: Very Low = slight inconvenience; Low = easily recoverable losses; Nominal = some loss; High = high financial loss; Very High = risk to human life.


I&T Team Cohesion²: A multi-attribute parameter that includes leadership, shared vision, diversity of stakeholders, approval cycles, group dynamics, IPT framework, team dynamics, trust, and amount of change in responsibilities. It also represents the heterogeneity of the stakeholder community: the end users, customers, implementers, and development team. Table 28 defines the various rating values for the I&T team cohesion cost driver. (Note: These rating definitions are the same as the ones for the PRA and SO team cohesion cost drivers, but should be evaluated in terms of potential impacts on the I&T activities.)

Table 28. I&T Team Cohesion Ratings.

Culture: Very Low = stakeholders with diverse expertise, task nature, language, culture, and infrastructure; highly heterogeneous stakeholder communities. Low = heterogeneous stakeholder community; some similarities in language and culture. Nominal = shared project culture. High = strong team cohesion and project culture; multiple similarities in language and expertise. Very High = virtually homogeneous stakeholder communities; institutionalized project culture.

Compatibility: Very Low = highly conflicting organizational objectives. Low = converging organizational objectives. Nominal = compatible organizational objectives. High = clear roles and responsibilities. Very High = strong mutual advantage to collaboration.

Familiarity and trust: Very Low = lack of trust. Low = willing to collaborate, little experience. Nominal = some familiarity and trust. High = extensive successful collaboration. Very High = very high level of familiarity and trust.

SoS I&T Team Capability: Represents the anticipated level of SoS I&T team cooperation and cohesion, personnel capability and continuity, as well as I&T personnel experience with the relevant domains, applications, language, integration tools, and integration platform(s) needed to integrate the SoS system components and test the SoS. Table 29 defines the various rating values for the SoS I&T team capability cost driver.


Table 29. SoS I&T Team Capability Ratings.

I&T Cohesion: Very Low = highly conflicting organizational objectives; lack of trust. Low = converging organizational objectives; willing to collaborate, little experience. Nominal = compatible organizational objectives; some familiarity and trust. High = clear roles and responsibilities; extensive successful collaboration. Very High = strong mutual advantage to collaboration; very high level of familiarity and trust.

I&T Capability: Very Low = 15th percentile; Low = 35th percentile; Nominal = 55th percentile; High = 75th percentile; Very High = 90th percentile.

Specific SoS I&T-Relevant Experience: Very Low = less than 2 months; Low = 1 year of continuous experience plus other technical experience in a similar job; Nominal = 3 years of continuous experience; High = 5 years of continuous experience; Very High = 10 years of continuous experience.

Expected Annual I&T Turnover: Very Low = 48%; Low = 24%; Nominal = 12%; High = 6%; Very High = 3%.

LSI I&T Process Maturity: A parameter that rates the maturity level and completeness of the LSI’s processes and plans, and in particular, those associated with I&T activities and the SoS integration lab. Table 30 defines the various rating values for the LSI I&T process maturity cost driver.

Table 30. LSI I&T Process Maturity Ratings.

I&T Assessment Rating (Capability or Maturity): Very Low = Level 0 (if continuous model); Low = Level 1; Nominal = Level 2; High = Level 3; Very High = Level 4; Extra High = Level 5.

I&T Team Behavioral Characteristics: Very Low = ad hoc approach to process performance. Low = performed I&T process; activities driven only by immediate contractual or customer requirements; I&T focus limited. Nominal = managed I&T process; activities driven by customer and stakeholder needs in a suitable manner; I&T focus is a project-centric approach, not driven by organizational processes. High = defined I&T process; activities driven by benefit to the project; I&T focus is a process approach driven by organizational processes tailored for the project. Very High = quantitatively managed I&T process; activities driven by I&T benefit. Extra High = optimizing I&T process; continuous improvement; activities driven by systems engineering and organizational benefit.


I&T Tool Support²: A parameter that rates the coverage, integration, and maturity of the tools in the SoS I&T environment. Table 31 defines the various rating values for the I&T tool support cost driver.

Table 31. I&T Tool Support Ratings.

Very Low: No I&T tools.

Low: Simple I&T tools; little integration.

Nominal: Basic I&T tools, moderately integrated throughout the systems engineering process.

High: Strong, mature I&T tools, moderately integrated with other disciplines.

Very High: Strong, mature, proactive use of I&T tools, integrated with the process, model-based SE, and management systems.

I&T Process Cost/Schedule Compatibility: The extent of business or political pressures to reduce the cost and schedule associated with the I&T processes and activities. Table 32 defines the various rating values for the I&T process cost/schedule compatibility cost driver. (Note: These rating definitions are the same as the ones for the PRA and SO cost/schedule compatibility cost drivers, but should be evaluated in terms of potential impacts on the I&T activities.)

Table 32. I&T Process Cost/Schedule Compatibility Ratings.

Very Low: Estimates exceed the budgeted cost and schedule by more than 100%.

Low: Estimates are between 50% and 100% greater than the budgeted cost and schedule.

Nominal: Estimates are between 20% and 50% greater than the budgeted cost and schedule.

High: Budgeted cost and schedule are within 20% of the estimates.

SoS I&T Risk Resolution: A multi-attribute parameter that represents the number of major SoS I&T risk items, the maturity of risk management and mitigation plan, compatibility of schedules and budgets, expert availability, tool support, and level of uncertainty in SoS I&T risk areas. Table 33 defines the various rating values for the SoS I&T risk resolution cost driver.


Table 33. SoS I&T Risk Resolution Ratings.

Number and criticality of I&T risk items: Very Low = more than 10 critical; Low = 5-10 critical; Nominal = 2-4 critical; High = 1 critical; Very High = fewer than 10 non-critical.

I&T risk mitigation activities: Very Low = none; Low = little; Nominal = some; High = risks generally covered; Very High = risks fully covered.

Schedule, budget, and internal milestones compatible with the I&T Risk Management Plan and integration scope: Very Low = none; Low = little; Nominal = some; High = generally; Very High = mostly.

Percentage of top system engineers and integrators available to support I&T activities: Very Low = 20%; Low = 40%; Nominal = 60%; High = 80%; Very High = 100%.

Tool support available for tracking I&T issues: Very Low = none; Low = little; Nominal = some; High = good; Very High = strong.

Level of uncertainty in I&T risk areas: Very Low = extreme; Low = significant; Nominal = considerable; High = some; Very High = little.

Component System Maturity and Stability: A multi-attribute parameter that indicates the maturity level of the component systems (number of new component systems versus number of component systems currently operational in other environments), overall compatibility of the component systems with each other and the SoS interface protocols, the number of major component system changes being implemented in parallel with the SoS framework changes, and the anticipated change in the component systems during SoS integration activities. Table 34 defines the various rating values for the component system maturity and stability cost driver.

Table 34. Component System Maturity and Stability Ratings.

Percentage of the total number of unique component systems expected to be existing/legacy systems (vs. new): Very Low = less than 30%; Low = 30-50%; Nominal = 50-75%; High = 75-90%; Very High = 90-100%.

Compatibility of component systems: Very Low = less than 30%; Low = 30-50%; Nominal = 50-75%; High = 75-90%; Very High = greater than 90%.

Percentage of major component system changes done in the integration release that are related to SoS capabilities: Very Low = less than 60%; Low = 60-70%; Nominal = 70-80%; High = 80-95%; Very High = greater than 95%.

Anticipated average component system change during the integration period: Very Low = greater than 20% change; Low = 10-20% change; Nominal = 5-10% change; High = 2-5% change; Very High = less than 2% change.


Component System Readiness: A parameter that rates the overall readiness of the component systems for integration. The user evaluates the level of Verification and Validation (V&V) that has been or will be performed prior to integration, and the level of subsystem integration that will be performed before integration into the SoS integration lab. Table 35 defines the various rating values for the component system readiness cost driver.

Table 35. Component System Readiness Ratings.

Very Low: Minimally V&V’d; no pre-integration.

Low: Some V&V; minimal pre-integration.

Nominal: Moderate V&V; some pre-integration.

High: Considerable V&V; moderate pre-integration.

Very High: Extensive V&V; considerable pre-integration.


7.0 An Initial Stage-wise SoS Cost Estimation Model

In this section, the planning and estimation of SISOS are addressed from a hybrid, evolutionary development viewpoint. It describes the approaches being used by many SISOS teams to plan and develop incremental SISOS capabilities using both agile and plan-driven techniques to accommodate rapid change while continuing to build, validate, and field capabilities (as described in [Boehm, 2006]). It also discusses how cost models are used to support both the short term and long term estimation needs of these programs.

As mentioned earlier and described in more detail in [Boehm, 2006] and [Lane and Boehm, 2006], SISOSs tend to be evolutionary; therefore, detailed long-term estimates are not typically feasible. What is more typical is that the over-arching architecture can be defined and developed along with the first several increments of the SISOS. SISOS development tends to be schedule- or cost-driven, with stakeholders wanting to know what can be done in the next year or two with a given budget before deciding how they want to evolve the SISOS next. The future increments are often determined by new technology development, some of which is driven by SISOS needs and some of which is developed independently of the SISOS but has applications within it. Part of the SISOS evolutionary process is the refresh of existing SISOS technologies as COTS products and network technologies evolve, and the evaluation and adoption of new technologies as unanticipated technology becomes available.

7.1. Hybrid Development Process

Recent work in analyzing SISOS organizational structures [Boehm, 2006] shows that many are adapting to the complexity of the SISOS environment by integrating agile teams with more traditional plan-driven (PD) teams and continuous V&V teams. Figure 4 provides an overview of this hybrid process for a single system development.

Figure 4. Overview of Hybrid Process for a Single Increment/Single System [Boehm and Lane, 2006].


The agile teams respond to the changing environment and define short, stable increments for development. The plan-driven teams implement capabilities in accordance with the stable increment definitions. The continuous V&V teams support the integration and test of the plan-driven increments. Figure 5 shows the key drivers for each team in the hybrid process and the flow of information between these teams.

Figure 5. Detailed View of Hybrid Process for Given Increment/Single System [Boehm and Lane, 2006].

Figure 6 shows how the total SISOS development effort can be viewed and estimated with respect to the COCOMO suite of estimation tools. Note that in the absence of a calibrated COSOSIMO model, COSYSMO can be used to estimate the LSI technical effort and non-parametric methods (e.g., percentage of total effort, activity-based costing) can be used to estimate the other LSI program management activities.

The initial up-front architecting of the SISOS system in the SISOS Inception phase, resulting in a Life Cycle Objectives (LCO) review that ensures that there are feasible options for the desired SoS, can be estimated using COSYSMO with parameters selected to best describe the SoS product, process, and LSI team characteristics. Once the total estimate is computed, it must be adjusted to reflect just the Inception effort. (The current COSYSMO model analysis suggests allocating 7% of the total effort for Inception, 16% for Elaboration, 35% for Development, 28% for Test and Evaluation, and 14% for Transition.)
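A minimal sketch of this adjustment step follows. The phase percentages are those quoted above from the COSYSMO model analysis; the 400 person-month total in the example is an illustrative placeholder, not a value from the report.

```python
# Sketch: scale a total COSYSMO effort estimate across life cycle
# phases using the suggested phase percentages quoted above.
PHASE_FRACTIONS = {
    "Inception": 0.07,
    "Elaboration": 0.16,
    "Development": 0.35,
    "Test and Evaluation": 0.28,
    "Transition": 0.14,
}

def effort_by_phase(total_effort_pm: float) -> dict:
    """Return effort in person-months (PM) per phase."""
    return {phase: total_effort_pm * f for phase, f in PHASE_FRACTIONS.items()}

# e.g., a 400 PM COSYSMO estimate yields 28 PM for the Inception phase.
print(effort_by_phase(400.0)["Inception"])  # -> 28.0
```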


Figure 6. SoS Cost Estimation Across Life Cycle Phases and Increments [Boehm and Lane, 2006].

In the next phase, the SISOS Elaboration phase, the LSI team must identify the specific system components to be integrated into the SISOS, develop RFPs to solicit responses from prospective vendors, select vendors, adjust the architecture to be consistent with the selected vendors, and then conduct a Life Cycle Architecture (LCA) review at the SISOS level to show the feasibility of the selected approach. A key part of this effort is the evaluation of the supplier and vendor proposals, looking at the completeness and feasibility of their approaches, and identifying potential risks or rework that might result from their approaches. This effort must be estimated using a COSOSIMO-like model since many of these activities are not typically part of a more traditional systems engineering effort.

As the supplier and vendor contracts are put in place, work on the first increment begins. The suppliers/vendors begin working to the plans that the LSI teams developed in the Elaboration phase, using their plan-driven teams. This activity begins with LCA reviews at the supplier level to ensure the feasibility of the supplier’s approach and to identify any additional risks to be tracked during the development. During the early SISOS increments, the LSI may also have “supplier teams” that are responsible for developing the SISOS infrastructure. These development efforts are estimated for each system component using a combination of COSYSMO for the component system engineering effort and either a COCOMO II-like cost model or, for more rapid development processes, a CORADMO-like cost model (a COCOMO II variant for rapid application development) for the associated software development effort.

At the same time the suppliers and vendors are working on Increment 1, the LSI and system supplier agile teams are assessing potential sources of change, rebaselining requirements for the next increment, and negotiating those changes with the suppliers and vendors. In addition, the LSI V&V team is continually monitoring, verifying, and validating Increment 1, resulting in an Initial Operational Capability (IOC) review for Increment 1. These LSI planning, adjustment, and V&V efforts are estimated using a COSOSIMO-like model. The added system supplier rebaselining efforts can be estimated by using requirements volatility parameters.

As one increment is completed, the plan-driven development teams begin work on the subsequent increment, n, that has been defined “just-in-time” by the agile teams, while the agile teams continue their forward-looking work on increment n+1. By putting these pieces together for the known SISOS increments, it is possible to develop a fairly accurate estimate of the total SoS development effort for the defined increments.

7.2. Estimation of SISOS Development Effort for a Given Iteration

To develop effort estimates for the total SISOS development, one must include estimates for the SoSE activities as well as the development activities of all of the suppliers providing functionality for the given increment. Figure 7 shows how the activities of the SoSE team and the increment’s suppliers are coordinated and synchronized.

Figure 7. Combining SoSE and Component Supplier Processes [Boehm and Lane, 2007b].


In order for this development process to be successful, it is important for the SoSE team to work closely with the suppliers, vendors, and strategic partners to understand what SISOS functionality can be realistically provided for the increment being estimated. Key to this process is the ability of the system component suppliers to plan, implement, and provide functionality identified for each increment. This requires realistic estimates and schedules from the suppliers to support the SISOS estimation process. Late “pivots” of functionality from the current SISOS increment to the next can often have significant impacts on integration activities for the current increment as well as the subsequent increment, often causing extensive re-work to integrate and test deferred capabilities.
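Assuming realistic supplier estimates are available, the increment-level roll-up described above reduces to a simple sum of the SoSE effort and each supplier's development effort. The sketch below is illustrative only; the supplier names and person-month values are placeholders.

```python
# Hypothetical roll-up for one increment: total effort = SoSE effort
# (COSOSIMO-like estimate) + each supplier's development effort
# (COSYSMO plus COCOMO II/CORADMO-like estimates). All values in
# person-months (PM) below are placeholders.
def increment_total_effort(sose_pm: float, supplier_pm: dict) -> float:
    return sose_pm + sum(supplier_pm.values())

suppliers = {"supplier_1": 220.0, "supplier_2": 140.0, "infrastructure": 90.0}
print(increment_total_effort(sose_pm=160.0, supplier_pm=suppliers))  # -> 610.0
```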

7.3. Viewing the Hybrid Process in the SISOS Environment

Figures 4 and 5 above showed a simple view of the hybrid process that might be used to develop a single software system. Figure 8 shows how this process scales up to the SISOS environment. At the SoS level, early efforts are initiated to define and elaborate the SoS, and as the SoS-level agile team begins to define the early increments, potential suppliers, vendors, and strategic partners are identified to provide the component systems for the initial increments of the SoS.

Figure 8. Using Hybrid Processes for SISOS Development [Boehm and Lane, 2007a].

It is important to understand that most of the component systems to be incorporated into the SoS already exist and have their own sets of stakeholders and evolutionary goals. (The exception is new component systems that need to be developed for the SoS.) Because these systems have their own purposes and their own upgrade/enhancement strategies and priorities, the SoSE team must negotiate with these organizations in order to have them implement the needed SISOS features. This negotiation process requires that the SoSE team and the component system owners work to find appropriate component system increments into which the SISOS functionality can be incorporated without significantly compromising other component system priorities. It also means that the SoSE team must make adjustments if any of the suppliers cannot meet the SISOS schedule because of either the time required to develop the needed SISOS functionality or higher priority commitments. Synchronization points and interim feasibility assessments must also be coordinated between the SoSE team and the suppliers/vendors/strategic partners to ensure timely delivery of the component systems to the SISOS integration environment. Figure 8 illustrates this negotiation process: Supplier 1 plans to provide SISOS functionality in its increment x, Supplier 2 plans to provide SISOS functionality in its increment y, and these component systems will be provided for version 1 of the SISOS.

7.4. Combining Agile/Plan-Driven Work in the SISOS Effort Estimates

As shown in Figure 8, hybrid processes that combine both agile and plan-driven work can be used at both the SoS level and the supplier level. In order to have effort estimates reflect the use of these hybrid processes, it is important to estimate each aspect separately. To do this, identify the appropriate cost models/techniques to be used to estimate the SISOS increment (e.g., COSYSMO, COCOMO II, or expert judgment/analogy techniques in conjunction with the COSOSIMO conceptual model). Then execute each cost model/technique twice: first using more agile parameter settings, then a second time using more plan-driven settings. Since these are typically different teams, the personnel characteristics should also be adjusted to reflect the capabilities and experience levels of each team. Finally, using the appropriate cost model guidance for the distribution of effort across phases, or organizational historical data, merge the two estimates, using the agile values for agile phases and the plan-driven values for plan-driven and V&V phases/activities (see the sketch below). Where the boundaries between agile and plan-driven work are not clear, use expert judgment to determine the appropriate percentages for the given activity/phase.
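The following sketch illustrates this two-run merge under stated assumptions: the phase labels, the person-month values, and the designation of which phases are "agile" are hypothetical; in practice the split would come from the cost model's phase distribution guidance or from organizational historical data.

```python
# Sketch of the two-run approach: run the chosen model once with
# agile-team settings and once with plan-driven settings, then merge
# per phase. All phase names and values below are illustrative.
def merge_hybrid_estimates(agile_est: dict, pd_est: dict, agile_phases: set) -> float:
    """Sum agile-run effort for agile phases and plan-driven-run
    effort for all remaining phases (person-months)."""
    total = 0.0
    for phase in agile_est:
        total += agile_est[phase] if phase in agile_phases else pd_est[phase]
    return total

agile_run = {"rebaselining": 40.0, "development": 300.0, "v_and_v": 120.0}
pd_run    = {"rebaselining": 55.0, "development": 260.0, "v_and_v": 100.0}
print(merge_hybrid_estimates(agile_run, pd_run, {"rebaselining"}))  # -> 400.0
```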

7.5. Final Comments on Total SISOS Development Costs

This technical report has focused primarily on the estimation of effort associated with the various engineering activities required to define, develop, integrate, and test an SoS. However, as a reminder, this is not the total cost of development. Total costs also include cost elements such as personnel travel, development tools and equipment, SoS infrastructure equipment costs (e.g., hardware and software COTS products), SoS COTS licenses and maintenance contracts, development of infrastructure to support coordination and communications between different organizations (e.g., collaborative websites), and SoS-level integration and test facility, hardware, and software costs. [Stutzke, 2006] provides additional guidance for developing total system and SISOS cost estimates.

Page 49: Modern Tools to Support DoD Software - USC

Modern Tools to Support SISOS Cost Estimation 43

8.0 Conclusions

This technical report looked at the motivations and approaches for developing SISOS and then provided detailed planning and estimation guidance to help planners develop successful strategies for SISOS development. There are many potential advantages in investing in a system of systems, as well as pitfalls that must be addressed in planning the development of these systems. These include avoiding the unacceptable delays in service, conflicting plans, bad decisions, and slow response to fast-moving events involved with current collections of incompatible systems. On the positive side, successful incremental planning strategies enable organizations to see first, understand first, act first, and finish decisively, and to rapidly adapt to changing circumstances. However, in assessing the return on investment in a system of systems, one must assess the size of this investment, and these costs are very easy to underestimate.

For organizations such as DoD that must develop high-assurance systems of systems from closely-coupled, often incompatible and independently evolving, often unprecedented systems, the investment costs for SoSE can be extremely high, particularly if inappropriate SoSE strategies are employed. Although not enough data on completed SISOS projects is currently available to calibrate models for estimating these costs, enough is known about the SoSE cost sources and cost drivers to provide a framework for determining the relative cost and risk of developing systems of systems with alternative scopes and development strategies before committing to a particular SISOS scope and SoSE strategy.

In particular, this study has identified three primary areas in which SISOS costs are likely to be higher than the counterpart costs for traditional systems. These are (1) planning, requirements management, and analysis; (2) source selection and supplier oversight; and (3) system of systems integration and testing. In order to help SISOS planners, architects, and managers better identify and assess the magnitude of these added costs, the study has provided cost driver rating scales for determining which sources of cost are most in need of further analysis of candidate SISOS scoping, architecting, and funding decisions. Further research is underway to better determine the relative contributions of the cost drivers, and eventually to calibrate the cost driver parameters to a predictive SISOS cost estimation model.

Page 50: Modern Tools to Support DoD Software - USC

Modern Tools to Support SISOS Cost Estimation 44

9.0 References

[Abts, 2004] Abts, C., “Extending the COCOMO II Cost Model to Estimate COTS-Based System Costs,” USC Ph. D. Dissertation, 2004.

[Ackoff, 1971] Ackoff, R., “Towards a System of Systems Concepts”, Management Science, Vol 17, No. 11, Theory Series, pp. 661-671, July 1971.

[ANSI/EIA, 1999] ANSI/EIA, ANSI/EIA-632-1998, Processes for Engineering a System, 1999.

[Berry, 1964] Berry, B., “Cities as Systems within Systems of Cities”, The Regional Science Association Papers, Volume 13, 1964.

[Blanchette, 2005] Blanchette, S., U.S. Army Acquisition – The Program Executive Officer Perspective, Special Report CMU/SEI-2005-SR-002, 2005.

[Boehm, 2006] Boehm, B., “Some Future Trends and Implications for Systems and Software Engineering Processes”, Systems Engineering, Vol. 1, No. 1, pp. 1-19, 2006.

[Boehm and Lane, 2006] Boehm, B. and J. Lane, "21st Century Processes for Acquiring 21st Century Software-Intensive Systems of Systems." CrossTalk: Vol. 19, No. 5, pp.4-9, 2006.

[Boehm and Lane, 2007a] Boehm, B., and J. Lane, The Incremental Commitment Model for System of Systems Development, Systems and Software Technology Conference, 2007.

[Boehm and Lane, 2007b] Boehm, B., and J. Lane, “Using the Incremental Commitment Model to Integrate System Acquisition, Systems Engineering, and Software Engineering”, CrossTalk, Vol. 19, No. 10, pp. 4-9, October 2007.

[Boehm and Turner, 2004] Boehm, B., and R. Turner, Balancing Agility and Discipline, Addison Wesley, 2004.

[Boehm et al, 2000] Boehm, B., Abts, C., Brown, A. W., Chulani, S., Clark, B. K., Horowitz, E., Madachy, R., Reifer, D., Steece, B., Software Cost Estimation with COCOMO II, Prentice Hall, New Jersey, 2000.

[Boehm et al, 2005] Boehm, B., Valerdi, R., Lane, J., and Brown, W., “COCOMO Suite Methodology and Evolution”, CrossTalk, Vol. 18, No. 4, pp. 20-25, April 2005.

[Cockburn, 2001] Cockburn, A., Writing Effective Use Cases, Addison-Wesley, 2001.

[Cost Xpert Group, 2003] Cost Xpert Group, Inc. Cost Xpert 3.3 Users Manual. San Diego, CA: Cost Xpert Group, Inc., 2003.

[DoD, 2006a] DoD, System Acquisition Guide, Version 1.6, 2006.

[DoD, 2006b] DoD, System of Systems Systems Engineering Guide: Considerations for Systems Engineering in a System of Systems Environment, draft version 0.9, 2006.

[Finley, 2006] Finley, J., “Keynote Address”, Proceedings of the 2nd Annual System of Systems Engineering Conference, 2006.

[Galorath, 2001] Galorath, Inc. SEER-SEM Users Manual. El Segundo, CA: Galorath Inc., 2001.

[GAO, 2006] Government Accountability Office (GAO), Report to Congressional Committees: Defense Acquisitions, Assessments of Selected Major Weapon Programs, GAO-06-391, 2006.

[Garber, 2006] Garber, V., “Keynote Presentation”, Proceedings of the 2nd Annual System of Systems Engineering Conference, 2006.

[IUSSCAA, 2006] IUSS-Caesar Alumni Association (IUSSCAA), IUSS History, http://www.iusscaa.org/history.htm, accessed on 12/27/2006.

[Krygiel, 1999] Krygiel, A. Behind the Wizard’s Curtain, CCRP Publication Series, July, 1999.

[Lane, 2005a] Lane, J., "System of Systems Lead System Integrators: Where do They Spend Their Time and What Makes Them More/Less Efficient: Background for COSOSIMO", University of Southern California Center for Systems and Software Engineering, USC-CSE-2005-508, 2005.

[Lane, 2005b] Lane, J., “System of Systems (SoS) Processes”, Proceedings of USC CSE Annual Research Review, March 2005.

[Lane, 2006] Lane, J., "COSOSIMO Parameter Definitions", USC-CSE-TR-2006-606. Los Angeles, CA: University of Southern California Center for Systems and Software Engineering, 2006.

[Lane, 2007] Lane, J., “Understanding Differences Between System of Systems Engineering and Traditional Systems Engineering: PhD Qualifying Exam Proposal”, University of Southern California Center for Systems and Software Engineering Technical Report USC-CSSE-2007-704, 2007.

[Lane and Boehm, 2006] Lane, J., and B. Boehm, “Synthesis of Existing Cost Models to Meet System of Systems Needs”, Conference on Systems Engineering Research, 2006.

[Lane and Valerdi, 2005] Lane, J. and Valerdi, R., “Synthesizing SoS Concepts for Use in Cost Estimation”, Proceedings of IEEE Systems, Man, and Cybernetics Conference, 2005.

[Madachy et al, 2006] Madachy, R., Boehm, B., Lane, J., "Assessing Hybrid Incremental Processes for SISOS Development", USC CSSE Technical Report USC-CSSE-2006-623, 2006.

[Maier, 1998] Maier, M., “Architecting Principles for Systems-of-Systems”, Systems Engineering, 1:4, pp 267-284, 1998.

[Meilich, 2006] Meilich, A., “System of Systems Engineering (SoSE) and Architecture Challenges in a Net Centric Environment”, Proceedings of the 2nd Annual System of Systems Engineering Conference, 2006.

[NAVSTAR, 2006] NAVSTAR Global Positioning System Joint Program Office, http://gps.losangeles.af.mil/, accessed on 12/6/2006.

[OSD ATL, 2000] Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, Defense Procurement and Acquisition Policy, Vol. 2, 2002, http://www.acq.osd.mil/dpap/contractpricing/vol2chap4.htm

[Pair, 2006] Pair, C., “Keynote Presentation”, Proceedings of the 2nd Annual System of Systems Engineering Conference, 2006.

[Pressman and Wildavsky, 1973] Pressman, J. and A. Wildavsky, Implementation: How Great Expectations in Washington are Dashed in Oakland; Or, Why It’s Amazing that Federal Programs Work at All, This Being a Saga of the Economic Development Administration as Told by Two Sympathetic Observers Who Seek to Build Morals on a Foundation of Ruined Hopes, University of California Press, 1973.

[PRICE, 2006] PRICE, Program Affordability Management, http://www.pricesystems.com, accessed on 12/30/2006.

[QSM, 2006] QSM, SLIM-Estimate, http://www.qsm.com/slim_estimate.html, accessed on 12/30/2006.

[Siel, 2006] Siel, C., “Keynote Presentation”, Proceedings of the 2nd Annual System of Systems Engineering Conference, 2006.

[SoSECE, 2006] Proceedings of the 2nd Annual System of Systems Engineering Conference, Sponsored by System of Systems Engineering Center of Excellence (SoSECE), http://www.sosece.org/, 2006.

[SOFTSTAR, 2006] SOFTSTAR, http://softstarsystems.com, accessed on 12/30/2006.

[Stutzke, 2005] Stutzke, R., Estimating Software-Intensive Systems: Projects, Products, and Processes, Addison-Wesley, 2005.

[USAF, 2005] United States Air Force Scientific Advisory Board, Report on System-of-Systems Engineering for Air Force Capability Development; Public Release SAB-TR-05-04, 2005.

[Valerdi, 2005] Valerdi, R., The Constructive Systems Engineering Cost Model (COSYSMO), PhD Dissertation, University of Southern California, May 2005.

[Wang et al, 2007] Wang, G., Wardle, P., and Ankrun, A., “Architecture-Based Drivers for System-of-Systems and Family-of-Systems Cost Estimating”, 17th INCOSE Symposium, 2007.

[Wilson, 2007] Wilson, J., “System of Systems Cost Estimating Solutions”, Proceedings of the 2007 ISPA-SCEA Joint Conference, New Orleans, LA, 12-15 June 2007.

Appendix A

Glossary of Terms

agile methods The primary goals of agile methods are rapid value and responsiveness to change [Boehm and Turner, 2004]. The Manifesto for Agile Software Development (http://agilemanifesto.org/) states:

“We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more.”

component system An independently developed and managed system that can dynamically join and leave the SoS

convergence protocol A protocol that allows multiple systems to interact and has been generally accepted by a community as a de facto standard. [USAF, 2005]

cost driver Cost model parameter used to adjust nominal effort based on a pre-defined rating scale, usually ranging from “very low” to “very high”

cost estimating relationship A cost estimating relationship (CER) is a technique used to estimate a particular cost or price by using an established relationship with an independent variable. If you can identify an independent variable (driver) that demonstrates a measurable relationship with contract cost or price, you can develop a CER. That CER may be mathematically simple in nature (e.g., a simple ratio) or it may involve a complex equation. [OSD ATL, 2000]
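
As a simple illustration of this concept, the minimal Python sketch below fits a one-variable linear CER (cost = a + b × driver) to a handful of invented data points; a real CER would be derived from validated historical program data and an appropriately chosen functional form.

    # Illustrative one-variable linear CER: cost = a + b * driver.
    # The data points below are invented for illustration only.
    xs = [100, 250, 400, 600]   # hypothetical driver values (e.g., KSLOC)
    ys = [1.9, 4.8, 7.6, 11.5]  # hypothetical costs in $M

    # Ordinary least-squares fit of slope b and intercept a.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x

    print(f"CER: cost = {a:.2f} + {b:.4f} * driver")
    print(f"Estimate for driver = 500: ${a + b * 500:.1f}M")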

cost model A tool to help one reason about the cost and schedule implications of decisions one may need to make [Boehm et al, 2000]

integrated product team Multi-disciplinary team (often from multiple organizations) responsible for the design/development of a system (or SoS) product

key performance parameter Desired system (or SoS) performance attribute, often related to transaction speed, communication latency, security, safety, reliability, etc.

plan-driven methods The primary goals of plan-driven methods are predictability, stability, and high assurance. The plans, work products, and verification and validation strategies of plan-driven methods support these goals. [Boehm and Turner, 2004]

size driver A quantitative cost model parameter used to determine the nominal effort associated with the indicated activities

SoS System developed by creating a framework or architecture to integrate new and existing component systems

SoSE The set of engineering activities performed to define the desired SoS-level capabilities, develop the SoS-level architecture, identify sources to either supply or develop the required SoS component systems, and then integrate and test these high-level components within the SoS architecture framework, resulting in a deployable SoS

Appendix B

List of Acronyms

ABCS Army Battlefield Command System

ASW Anti-Submarine Warfare

B/L Baselined

CER Cost Estimating Relationship

COCOMO Constructive Cost Model

CORADMO Constructive Rapid Application Development Model

COSOSIMO Constructive System of Systems Integration Cost Model

COSYSMO Constructive Systems Engineering Cost Model

COTS Commercial Off-the-Shelf

CSSE Center for Systems and Software Engineering

DI Development Increment

DoD Department of Defense

FCS Future Combat Systems

GPS Global Positioning System

IOC Initial Operational Capability

IPT Integrated Product Team

IUSS Integrated Undersea Surveillance System

I&T Integration and Testing

KPP Key Performance Parameter

LCO Life Cycle Objectives

LSI Lead System Integrator

NIFC-CA Naval Integrated Fire Control-Counter Air

OODA Observe, Orient, Decide, Act

OO&D Observe, Orient, and Decide

PD Plan-Driven

PRA Planning, Requirements Management, and Architecting

RFP Request for Proposal

ROM Rough Order of Magnitude

SBCT Stryker Brigade Combat Team

SE Systems Engineering

SIAP Single Integrated Air Picture

SISOS Software-Intensive System of Systems

SO Source Selection and Supplier Oversight

SoS System of Systems

SoSE System of Systems Engineering

SOSUS Sound Surveillance System

TBD To Be Determined

TSE Traditional Systems Engineering

USC University of Southern California

V&V Verification and Validation