SERC 2015-TR-021-4.4 – Volume 4 1
Flexible and Intelligent Learning Architectures for SoS (FILA-SoS)
Volume 4 – Architecture Assessment Model
Technical Report SERC-2015-TR-021-4.4
February 19, 2015
Principal Investigator: Dr. Cihan H. Dagli, Missouri University of Science and Technology
Project Team
Co-PI: Dr. David Enke, Missouri S&T
Co-PI: Dr. Nil Ergin, Penn State University
Co-PI: Dr. Dincer Konur, Missouri S&T
Co-PI: Dr. Ruwen Qin, Missouri S&T
Co-PI: Dr. Abhijit Gosavi, Missouri S&T
Dr. Renzhong Wang
Missouri S&T Graduate Students
Louis Pape II, Siddhartha Agarwal and Ram Deepak Gottapu
Flexible and Intelligent Learning Architectures for SoS (FILA-SoS)
Volume 1: Integrated Model Structure
Volume 2: Meta-Architecture Generation Multi-Level Model
Volume 3: Fuzzy-Genetic Optimization Model
Volume 4: Architecture Assessment Model
Volume 5: Cooperative System Negotiation Model
Volume 6: Non-Cooperative System Negotiation Model
Volume 7: Semi-Cooperative System Negotiation Model
Volume 8: Incentive based Negotiation Model for System of Systems
Volume 9: Model for Building Executable Architecture
Volume 10: Integrated Model Software Architecture and Demonstration FILA-SoS Version1.0
Volume 11: Integrated Model Structure FILA-SoS Version 2.0
Volume 12: Complex Adaptive System-of-System Architecture Evolution Strategy Model for FILA-SoS Version 2.0
Volume 13: On the Flexibility of Systems in System of Systems Architecting: A new Meta-Architecture Generation Model for FILA-SoS Version 2.0
Volume 14: Assessing the Impact on SoS Architecture Different Level of Cooperativeness: A new Model for FILA-SoS Version 2.0
Volume 15: Incentivizing Systems to Participate in SoS and Assess the Impacts of Incentives: A new Model for FILA-SoS Version 2.0
Volume 16: Integrated Model Software Architecture for FILA-SoS Version 2.0
Volume 17: FILA-SoS Version 1.0 Validation with Real Data
Copyright © 2015 Stevens Institute of Technology, Systems Engineering Research Center
The Systems Engineering Research Center (SERC) is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology. This material is based upon work supported, in whole or in part, by the U.S. Department of Defense through the Office of the Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) under Contract Numbers: H98230-08-D-0171 and HQ0034-13-D-004. Any views, opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Department of Defense or ASD(R&E). No Warranty. This Stevens Institute of Technology and Systems Engineering Research Center material is furnished on an “as-is” basis. Stevens Institute of Technology makes no warranties of any kind, either expressed or implied, as to any matter including, but not limited to, warranty of fitness for purpose or merchantability, exclusivity, or results obtained from use of the material. Stevens Institute of Technology does not make any warranty of any kind with respect to freedom from patent, trademark, or copyright infringement. This material has been approved for public release and unlimited distribution.
Executive Summary
Multi-faceted systems of the future will entail complex logic and reasoning arranged across many intricate levels. These systems are organized as webs of connections, demonstrate self-driven adaptability, are designed for autonomy, and may exhibit emergent behavior that can be visualized. The ongoing challenge is to handle this complexity while designing and operating such systems. In Complex Adaptive Systems design, the challenge is to create an organized complexity that allows a system to achieve its goals. This report attempts to push the boundaries of complexity research by identifying challenges and opportunities. A complex adaptive system-of-systems (CASoS) approach is developed to handle the large uncertainty inherent in socio-technical systems.
Classically, four categories of SoS are described in the literature (Dahmann, Rebovich, Lowry, Lane, & Baldwin, 2011): Directed, Collaborative, Acknowledged, and Virtual. However, infinitely many SoS exist on the edges of these categories, making the classification a continuum; many SoS with different configurations can fill the gaps. The four types vary in their degree of managerial control over the participating systems and in their structural complexity. The spectrum ranges from Directed SoS, which represent complicated systems, to Virtual SoS, which are complex systems; Acknowledged SoS lie in between, and they are the focal point of our research endeavor. Acknowledged SoS and Directed SoS share some similarities: both have SoS objectives, management, funding, and authority (Dahmann & Baldwin, 2011). Unlike in a Directed SoS, however, the constituent systems of an Acknowledged SoS are not subordinated to the SoS; they retain their own management, funding, and authority in parallel with the SoS. Collaborative SoS are similar to Acknowledged SoS in that their systems voluntarily work together to address shared or common interests.
The Flexible and Intelligent Learning Architectures for SoS (FILA-SoS) integrated model developed in this research task provides a decision-making aid for the SoS manager based on the wave model. FILA-SoS does so using a straightforward system-definition methodology and an efficient analysis framework that supports the exploration and understanding of key trade-offs and requirements by a wide range of system-of-systems stakeholders and decision makers in a short time. FILA-SoS and the Wave Process address four of the most challenging aspects of system-of-systems architecting:
1. Dealing with the uncertainty and variability of the capabilities and availability of potential
component systems;
2. Providing for the evolution of the system-of-system needs, resources and environment over
time;
3. Accounting for the differing approaches and motivations of the autonomous component
system managers;
4. Optimizing system-of-systems characteristics in an uncertain and dynamic environment with
fixed budget and resources.
Some highlights of FILA-SoS are listed below in terms of its capabilities, its value added to systems
engineering, its ability to perform “what-if” analysis, the modularity of its integrated models, its
potential applications in the real world, and the additions planned for the current version.

FILA-SoS has a number of unique capabilities, such as an integrated model for modeling and simulating
SoS evolution over multiple waves. Its structure is modular: the models can be run independently or in
conjunction with each other. It also offers several different models for architecture generation and SoS
behavior, as well as various negotiation models capturing individual system behavior between the SoS and
the individual systems. In terms of value added, FILA-SoS aids the SoS manager in future decision making.
It also helps in understanding the emergent behavior of systems in the acquisition environment and its
impact on SoS architecture quality. FILA-SoS serves as an artifact for studying the dynamic behavior of
different types of systems (non-cooperative, semi-cooperative, and cooperative), and it enables us to
identify intra- and interdependencies among SoS elements and the acquisition environment. FILA-SoS can
provide a “what-if” analysis based on variables, such as SoS funding and capability priority, that can be
changed as the acquisition progresses through wave cycles. It can simulate any architecture through
Colored Petri Nets, and it can simulate rules of engagement and behavior settings: all systems
non-cooperative, all semi-cooperative, all cooperative, or any combination. Potential applications include
modeling a wide variety of complex systems, such as logistics and cyber-physical systems. FILA-SoS also
acts as a test bed for decision makers to evaluate operational guidelines and principles for managing
various acquisition-environment scenarios. Capabilities currently in progress include extending the model
to multiple interface alternatives among systems and incorporating risk models into environmental
scenarios.
The project reports span 17 volumes. Each report describes a different aspect of the FILA-SoS
integrated model.
Volume 1 is the Integrated Model Structure report for FILA-SoS Version 1.0. It provides a short
description of all the independent models that make up the FILA-SoS integrated model. The integrated
FILA-SoS has been tested on three notional systems-of-systems: a toy problem for Aircraft Carrier
Performance Assessment, ISR (intelligence, surveillance, and reconnaissance), and SAR (search and
rescue). The FILA-SoS integrated model is currently being validated with real-life data from a
medium-sized SoS; the results of this validation are given in Volume 17.
Volume 2 describes the Meta-Architecture Generation Multi-Level Model. The multi-level meta-architecture
generation model constructs an SoS architecture such that each capability is provided by at least one
system in the SoS and the systems in the SoS are able to communicate with each other. Second, it
generates a set of SoS architectures against multiple objectives: maximum total performance, minimum
total cost, and minimum deadline. Finally, the model establishes initial contracts with systems to
improve performance.
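The two structural constraints above (capability coverage and system-to-system communication) can be sketched as a feasibility check on a candidate architecture. The following is an illustrative sketch only, not the report's implementation; all system names, capabilities, and links are hypothetical.

```python
from collections import deque

def is_feasible(selected, provides, links, capabilities):
    """selected: set of system ids; provides: system -> set of capabilities;
    links: set of frozenset system pairs that can communicate."""
    # Constraint 1: each required capability is provided by some selected system.
    covered = set()
    for s in selected:
        covered |= provides.get(s, set())
    if not capabilities <= covered:
        return False
    # Constraint 2: the selected systems can all reach each other (BFS over links).
    if not selected:
        return False
    start = next(iter(selected))
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in selected:
            if v not in seen and frozenset((u, v)) in links:
                seen.add(v)
                queue.append(v)
    return seen == selected

# Hypothetical example: three systems, two required capabilities.
provides = {"S1": {"ISR"}, "S2": {"SAR"}, "S3": {"ISR", "SAR"}}
links = {frozenset(("S1", "S2"))}
print(is_feasible({"S1", "S2"}, provides, links, {"ISR", "SAR"}))  # True
print(is_feasible({"S1", "S3"}, provides, links, {"ISR", "SAR"}))  # False (no link)
```

An optimizer over the multiple objectives listed above would call a check like this to discard infeasible candidates before scoring them.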
Volume 3 illustrates the second meta-architecture generation model, known as the Fuzzy-Genetic
Optimization Model. This model is based on evolutionary multi-objective optimization for SoS
architecting, using genetic algorithms with four key performance attributes (KPAs) as the objective
functions. It also has a type-1 fuzzy assessor that dynamically assesses domain inputs and forms the
fitness function for the genetic algorithm. It returns the best architecture (meta-architecture),
consisting of systems and their interfaces. It is a generalized method applicable to multiple domains,
such as the Gulf War Intelligence/Surveillance/Reconnaissance case, the Aircraft Carrier Performance
Assessment case, and the Alaskan Maritime Search and Rescue case.
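The genetic-algorithm side of this idea can be sketched minimally: a bit-string chromosome selects which systems participate, and a fitness function trades performance against cost. This is an assumption-laden toy, not the report's model — in particular, the simple weighted-sum fitness below stands in for the type-1 fuzzy assessor, and all KPA numbers are synthetic.

```python
import random

random.seed(1)

N_SYSTEMS = 8
# Synthetic per-system KPA contributions (stand-ins for real domain data).
PERF = [random.random() for _ in range(N_SYSTEMS)]
COST = [random.random() for _ in range(N_SYSTEMS)]

def fitness(chrom):
    """Weighted performance/cost trade-off over the selected systems."""
    perf = sum(p for p, g in zip(PERF, chrom) if g)
    cost = sum(c for c, g in zip(COST, chrom) if g)
    return 0.7 * perf - 0.3 * cost

def evolve(pop_size=20, generations=50):
    pop = [[random.randint(0, 1) for _ in range(N_SYSTEMS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_SYSTEMS)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(N_SYSTEMS)       # point mutation
            child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, round(fitness(best), 3))
```

The returned bit string plays the role of a meta-architecture: it says which systems are in, leaving interfaces and negotiation to the later models.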
Volume 4 describes an Architecture Assessment Model that can capture the non-linearity in key
performance attribute (KPA) trade-offs, accommodate any number of attributes for a selected SoS
capability, and incorporate multiple stakeholders' understanding of the KPAs. Assessment is based on a
given meta-architecture alternative and is done using type-1 fuzzy sets and a fuzzy inference engine.
The model provides numerical values for meta-architecture quality.
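The core mechanism — mapping crisp KPA measures through type-1 membership functions and combining them with inference rules into one quality number — can be illustrated with a small sketch. The membership shapes, the two-attribute rule base, and the crisp output levels below are assumptions for illustration, not the report's calibrated model.

```python
def tri(x, a, b, c):
    """Triangular type-1 membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def assess(performance, affordability):
    """Both inputs scaled to [0, 1]; returns an architecture quality in [0, 1]."""
    p_low, p_high = tri(performance, -1, 0, 1), tri(performance, 0, 1, 2)
    a_low, a_high = tri(affordability, -1, 0, 1), tri(affordability, 0, 1, 2)
    # Mamdani-style rules: firing strength = min of antecedent memberships;
    # each rule points at a crisp output level (0, 0.5, or 1).
    rules = [
        (min(p_high, a_high), 1.0),  # both good -> excellent architecture
        (min(p_high, a_low), 0.5),   # trade-off cases -> acceptable
        (min(p_low, a_high), 0.5),
        (min(p_low, a_low), 0.0),    # both poor -> poor architecture
    ]
    # Weighted-average defuzzification.
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

print(assess(0.9, 0.8))  # high-quality architecture scores near 1
print(assess(0.2, 0.3))  # low-quality architecture scores near 0
```

Because the rules and memberships interact non-linearly, the resulting quality surface is not a plane — which is exactly the KPA trade-off non-linearity the model is meant to capture.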
Volumes 5, 6, and 7 describe the system negotiation models. The negotiation can be based on any
quantitative issue, such as the amount of funding required, the deadline for joining the SoS, or
performance-level requests.
Volume 5 specifically describes the Cooperative System Negotiation Model. Systems following this model
behave cooperatively while negotiating with the SoS manager. The model of cooperative behavior is based
on agent preferences and the negotiation length. Each system agent has two inherent cooperative
behaviors: purposive (normal behavior) and contingent (behavior driven by unforeseen circumstances). The
approach models the trade-off between the two behaviors, and a fuzzy weighted average is used to arrive
at the final proposed value.
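The weighted-average idea can be sketched in a few lines: the agent's final offer blends its purposive and contingent proposals, with the weight shifting toward the contingent value as the negotiation runs long. The linear weighting curve and the numbers are illustrative assumptions, not the report's calibrated fuzzy model.

```python
def final_offer(purposive, contingent, round_no, max_rounds):
    """Blend the two behaviors; weight on the contingent proposal grows
    linearly with elapsed negotiation rounds, capped at 1."""
    w = min(round_no / max_rounds, 1.0)
    return (1 - w) * purposive + w * contingent

# Early in the negotiation the offer stays near the purposive value...
print(final_offer(100.0, 140.0, round_no=1, max_rounds=10))
# ...and drifts toward the contingent value under time pressure.
print(final_offer(100.0, 140.0, round_no=9, max_rounds=10))
```

In the report's model the weights come from fuzzy memberships over preferences and negotiation length rather than a fixed linear schedule, but the aggregation step has this same weighted-average shape.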
Volume 6 goes on to describe the Non-Cooperative System Negotiation Model, in which systems behave in
their own self-interest while negotiating with the SoS coordinator. A mathematical model of each
individual system's participation capability and self-interested negotiation behavior is created. The
methodology is an optimization-based generator of alternatives for strategically negotiating multiple
items with multiple criteria. In addition, a conflict evaluation function that estimates the prospective
outcome for each identified alternative is proposed.
The third and last system negotiation model is described in Volume 7, which illustrates the
Semi-Cooperative System Negotiation Model. This model can be flexible or opportunistic: extremely
cooperative or uncooperative, depending on the parameter settings. A Markov-chain-based model is
designed for handling uncertainty in negotiation modeling in an SoS and is used for estimating the
outputs. The work assigned by the SoS to a system is assumed to be a “project” that takes a random
amount of time and a random amount of resources (funding) to complete.
Volume 8 explains the SoS negotiation model, also called the Incentive-Based Negotiation Model for
System of Systems. This model rests on two key aims: designing a contract that convinces the individual
systems to join the SoS development, and motivating the individual systems to do their tasks well. Game
theory and incentive-based contracts are used in a negotiation model that maximizes the welfare of the
parties involved. The SoS utility function takes into account the local objectives of the individual
systems as well as the global SoS objective, while the incentive contract design persuades uncooperative
systems to join the SoS development.
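The contract logic can be sketched as a toy numeric search: the SoS manager chooses the incentive that maximizes its own utility while satisfying the system's participation constraint (the system joins only if its payoff beats its outside option). All payoff functions and numbers below are illustrative assumptions, not the report's game-theoretic model.

```python
def system_payoff(incentive, effort_cost=3.0):
    """What the system nets by joining (hypothetical linear payoff)."""
    return incentive - effort_cost

def sos_utility(incentive, capability_value=10.0):
    """Value the SoS gains minus the incentive it pays out."""
    return capability_value - incentive

def best_contract(outside_option=1.0, step=0.5, max_incentive=10.0):
    best = None
    incentive = 0.0
    while incentive <= max_incentive:
        if system_payoff(incentive) >= outside_option:   # participation holds
            u = sos_utility(incentive)
            if best is None or u > best[1]:
                best = (incentive, u)
        incentive += step
    return best

incentive, utility = best_contract()
print(incentive, utility)  # smallest incentive that secures participation
```

Here the SoS utility falls as the incentive rises, so the optimum is the cheapest contract the system will still accept — the essence of the participation-constraint reasoning described above.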
Volume 9 illustrates the process of building executable architectures for SoS. SoS operation is a
dynamic process in which participating systems interact with each other and exchange various kinds of
resources, which can be abstract information or physical objects. This is modeled through a hybrid
structure of the OPM (Object-Process Methodology) and CPN (Colored Petri Net) modeling languages. The
OPM model is intuitive and easy to understand; however, it does not support simulation, which is
required for assessing behavior-related performance. This is achieved by mapping OPM to CPN, an
executable simulation language. The proposed method can model the interactions between components of a
system or between subsystems in an SoS. In addition, it can capture the dynamic aspects of the SoS and
simulate its behavior. Finally, it can assess various behavior-related performance measures of the SoS
and evaluate different constitutions or configurations of the SoS that cannot be incorporated into the
meta-architecture generation models of Volumes 2 and 3.
Volume 10 elucidates the Integrated Model Software Architecture and Demonstration based on the models
described above. From Volume 11 onward, the reports are aimed at the upcoming Version 2.0 of FILA-SoS.
Volume 11 provides the Integrated Model Structure for FILA-SoS Version 2.0, which could be implemented
in a new software environment.
Volume 12 provides a model to answer the first research question: “What is the impact of different
constituent system perspectives regarding participating in the SoS on the overall mission effectiveness
of the SoS?” It is named the Complex Adaptive System-of-System Architecture Evolution Strategy Model and
is incorporated in FILA-SoS Version 2.0. This volume describes a computational-intelligence-based
strategy involving meta-architecture generation through evolutionary algorithms, meta-architecture
assessment through type-2 fuzzy nets, and implementation through an adaptive negotiation strategy.
Volumes 13 and 14 provide two different approaches to answering the second research question: “How do
differing levels of cooperativeness in participating in the SoS impact the ability and timeliness of a
group to agree on an SoS or system architecture, or impact the ability to effectively use the
architecture already in place?”
Volume 13 is titled On the Flexibility of Systems in System of Systems Architecting: A New
Meta-Architecture Generation Model for FILA-SoS Version 2.0. The research question is answered through
an alternative meta-architecture generation technique, complementing the one described in Volume 2.
Volume 14 proposes a new method for assessing the impact of different levels of cooperativeness on the
SoS architecture. The second research question is answered through a model that allows different levels
of cooperativeness among the individual systems.
Volume 15 extends the previous system negotiation models based on incentivizing and is aptly called
Incentivizing Systems to Participate in SoS and Assess the Impacts of Incentives: A New Model for
FILA-SoS Version 2.0. It provides an approach to answering the third research question: “How should
decision-makers incentivize systems to participate in SoS, and better understand the impact of these
incentives during SoS development and effectiveness?” The model reflects the fact that incentives based
only on the outcome may not be enough to attract constituent systems to participate in the SoS mission.
It therefore extends the approach described in Volume 8 while considering uncertainty in the acquisition
environment. The incentive contract is designed based on the objectives of the SoS and of the individual
systems: each individual system's objective is to secure the highest incentives with minimal effort,
while the SoS manager's goal is to convince individual systems to join the SoS development while
maximizing its own utility.
Volume 16 gives an overview of the integrated model architecture in Version 2.0 of the software. It
includes all of the old and new models previously mentioned.
Finally, Volume 17 describes the validation of FILA-SoS Version 1.0 with real-life data from a
moderately sized SoS provided by the MITRE Corporation.
List of Publications Resulted and Papers Submitted from FILA-SoS Research
1. Wang, R., Agarwal,S., & Dagli, C. (2014). Executable System of Systems Architecture
Using OPM in Conjunction with Colored Petri Net: A Module for Flexible Intelligent &
Learning Architectures for System of Systems, In Europe Middle East & Africa Systems
Engineering Conference (EMEASEC).
2. Ergin, N. K. (2014). Improving Collaboration in Search and Rescue System of
Systems. Procedia Computer Science, 36, 13-20.
3. Agarwal, S., & Dagli, C. H. (2013). Augmented Cognition in Human–System
Interaction through Coupled Action of Body Sensor Network and Agent Based
Modeling. Procedia Computer Science, 16, 20-28.
4. Acheson, P., Dagli, C., & Kilicay-Ergin, N. (2013). Model Based Systems Engineering
for System of Systems Using Agent-based Modeling. Procedia Computer Science, 16,
11-19.
5. Agarwal, S., Pape, L. E., & Dagli, C. H. (2014). A Hybrid Genetic Algorithm and
Particle Swarm Optimization with Type-2 Fuzzy Sets for Generating Systems of
Systems Architectures. Procedia Computer Science, 36, 57-64.
6. Agarwal, S., Pape, L. E., Kilicay-Ergin, N., & Dagli, C. H. (2014). Multi-agent Based
Architecture for Acknowledged System of Systems. Procedia Computer Science, 28, 1-
10.
7. Agarwal, S., Saferpour, H. R., & Dagli, C. H. (2014). Adaptive Learning Model for
Predicting Negotiation Behaviors through Hybrid K-means Clustering, Linear Vector
Quantization and 2-Tuple Fuzzy Linguistic Model. Procedia Computer Science, 36,
285-292.
8. Agarwal, S., Wang, R., & Dagli, C. (2015). FILA-SoS, Executable Architectures using
Cuckoo Search Optimization coupled with OPM and CPN-A module: A new Meta-
Architecture Model for FILA-SoS. In Boulanger, F., Krob, D., Morel, G., & Roussel,
J.-C. (Eds.), Complex Systems Design & Management (CSD&M), France (pp. 175-192).
Springer International Publishing.
9. Pape, L., Agarwal, S., Giammarco, K., & Dagli, C. (2014). Fuzzy Optimization of
Acknowledged System of Systems Meta-architectures for Agent based Modeling of
Development. Procedia Computer Science, 28, 404-411.
10. Pape, L., & Dagli, C. (2013). Assessing robustness in systems of systems meta-
architectures. Procedia Computer Science, 20, 262-269.
11. Pape, L., Giammarco, K., Colombi, J., Dagli, C., Kilicay-Ergin, N., & Rebovich, G.
(2013). A fuzzy evaluation method for system of systems meta-architectures. Procedia
Computer Science, 16, 245-254.
12. Acheson, P., Dagli, C., & Kilicay-Ergin, N. (2014). Fuzzy Decision Analysis in
Negotiation between the System of Systems Agent and the System Agent in an
Agent-Based Model. arXiv preprint arXiv:1402.0029.
13. Kilicay-Ergin, N. H., Acheson, P., Colombi, J. M., & Dagli, C. H. (2012). Modeling
system of systems acquisition. In SoSE (pp. 514-518).
14. Acheson, P., Pape, L., Dagli, C., Kilicay-Ergin, N., Columbi, J., & Haris, K. (2012).
Understanding System of Systems Development Using an Agent-Based Wave Model.
Procedia Computer Science, 12, 21-30.
15. Konur, D., & Dagli, C. (2014). Military system of systems architecting with individual
system contracts. Optimization Letters, 1-19.
16. Agarwal, S., et al. (2015). Flexible and Intelligent Learning Architectures for SoS
(FILA-SoS): Architectural evolution in Systems-of-Systems. 2015 Conference on
Systems Engineering Research.
17. Ergin, D., & Dagli, C. Incentive Based Negotiation Model for System of Systems
Acquisition. (Accepted by Systems Engineering Journal)
18. Wang, R., & Dagli, C. Search Based Systems Architecture Development Using
Holistic Approach. (Accepted to IEEE Systems Journal with minor revisions)
Table of Contents

List of Publications Resulted and Papers Submitted from FILA-SoS Research...........................................10
1 Introduction..............................................................................................................................15
1.1 Motivation for research.............................................................................................................15
1.2 System of System Challenges.....................................................................................................18
1.3 How does FILA-SoS address SoS pain points..............................................................................21
2 Overview of the FILA-SoS integrated model..............................................................................25
2.1 Definition of Variables for SoS...................................................................................................25
2.2 Independent modules of FILA-SOS............................................................................................29
3 Architecture Assessment...........................................................................................................32
3.1 Acknowledged SoS.....................................................................................................................32
3.2 Proposed Modeling Approach...................................................................................................34
3.3 SoS Attributes............................................................................................................................36
4 Proposed Method for Developing an SoS Evaluation Model.....................................39
4.1 Use Case Model of the Domain Independent Method..............................................................39
4.2 Domain Independent Model Development..............................................................................42
4.2.1 Establishing A Vision of The SoS.........................................................................................43
4.2.2 Understanding Stakeholders Views...................................................................................49
4.2.3 Review of the Method Steps..............................................................................................54
4.2.4 Architecture Space Exploration..........................................................................................57
4.3 Individual Systems’ Information................................................................................................58
4.3.1 Cost, Performance and Schedule Inputs of Component Systems......................................58
4.3.2 Membership Functions......................................................................................................59
4.3.3 Mapping Attribute Measures to Fuzzy Variables...............................................................60
4.3.4 Non-Linear Trades in Multiple Objectives of SoS...............................................................62
4.4 Combining SoS Attribute Values Into an Overall SoS Measure..................................................63
4.5 Exploring the SoS Architecture Space with Genetic Algorithms.................................................66
4.6 Combining the Fuzzy Approach with the GA Approach.............................................................67
5 Concluding Remarks..................................................................................................................69
Cited and Related References...................................................................................................................70
List of Figures
Figure 1 Schematic drawing of four classical types of SoS based on Degree of Control and Degree of Complexity...........16
Figure 2 ISR System-of-Systems for testing FILA-SoS...........23
Figure 3 SAR System-of-Systems for testing FILA-SoS...........23
Figure 4 Assessing the Performance of Aircraft Carrier for testing FILA-SoS...........24
Figure 5 The Wave Model of SoS initiation, engineering, and evolution...........26
Figure 6 Integrated modules within FILA-SoS...........29
Figure 11. SoS evaluation method determines the fitness of each architecture, or (system + interface) SoS arrangement, from the meta-architecture and domain dependent information...........35
Figure 14. Use case diagram for developing a SoS Architecture; dashed lines are for information; solid lines are ‘responsible for,’ or active involvement; this portion of the effort excludes the NegotiateAchievableSoS use case...........40
Figure 15. Domain Independent Process Method for SoS Model building...........43
Figure 16. Sample OV-1 for Ballistic Missile Defense (ASN/RDA, 2009)...........46
Figure 17. Domain independent method to gather domain dependent data for SoS architecture model development and analysis, with output of a ‘good’ architecture at the lower right...........49
Figure 18. Given an Evaluation Model that depends on a chromosome from the meta-architecture, the genetic algorithm can optimize within a multidimensional space of attributes...........55
Figure 19. Matlab Fuzzy Toolbox printout of membership function shapes used in this analysis...........60
Figure 20. Map from fuzzy variable on horizontal axis to probability of detection on left...........61
Figure 21. Attribute values, mapped to fuzzy variables...........62
Figure 23. Example of nonlinear SoS fitness versus Affordability and Performance, based on membership function shapes and combining rules...........63
Figure 24. Exploring the meta-architecture - 25 chromosomes, 22 systems, Example 1...........67
Figure 25. Exploring the meta-architecture to map membership function edges, Example 2...........68
Figure 26. Exploring large populations to set the membership function edges...........68
List of Tables
Table 1 System of Systems and Enterprise Architecture Activity...........18
Table 2 List of SoS and component systems’ variable meanings within the meta-architecture...........40
Table 3 Example SoS evaluation model building questionnaire for creating an AV-1...........44
Table 4 Example AV-2, Integrated Dictionary...........53
Table 5 Example of a few powerful Fuzzy Inference Rules for combining attribute values...........64
1 Introduction
1.1 Motivation for research
In the real world, systems are complex, non-deterministic, evolving, and have human-centric
capabilities. The connections of all complex systems are non-linear, globally distributed, and evolve
in both space and time. Because of these non-linear properties, system connections create emergent
behavior. It is imperative to develop an approach for dealing with such complex large-scale systems.
The goal is not to try to control the system, but to design the system such that it controls and adapts
itself to the environment quickly, robustly, and dynamically. These complex entities include both
socioeconomic and physical systems, which undergo dynamic and rapid changes. Examples include
transportation, health, energy, cyber-physical systems, economic institutions, and communication
infrastructures.
In addition, the idea of a “System of Systems” is an emerging and important multidisciplinary area. An
SoS is defined as a set or arrangement of systems that results when independent and useful systems are
integrated into a larger system that delivers unique capabilities greater than the sum of the
capabilities of the constituent parts; none of the systems alone can achieve the overall goal
independently. A System of Systems (SoS) consists of multiple complex adaptive systems that behave
autonomously but cooperatively (Dahmann, Lane, Rebovich, & Baldwin, 2008). The continuous interaction
among them and their interdependencies produce emergent properties that cannot be fully accounted for
by “normal” systems engineering practices and tools. System of Systems Engineering (SoSE), an emerging
discipline in systems engineering, is attempting to form an original methodology for SoS problems
(Luzeaux, 2013).
Since SoS grow in complexity and scale over time, they require architectures that support
understanding, governance, and proper management and control. Systems architecting can be defined as
specifying the structure and behavior of an envisioned system. Classical system architecting deals with
static systems, whereas System of Systems (SoS) architecting must first be done at a meta-level. The
architecture achieved at a meta-level is known as the meta-architecture. The meta-architecture sets the
tone of the architectural focus (Malan & Bredemeyer, 2001); it narrows the scope of a fairly large
domain space and boundary. Although the architecture is still not fixed, the meta-architecture provides
multiple alternatives for the final architecture. Architecting can thus be described as filtering the
meta-architectures to finally arrive at the architecture. SoS architecting involves integrating
multiple systems architectures to produce an overall large-scale system meta-architecture for a
specifically designated mission (Dagli & Ergin, 2008). The SoS achieves the required goal by
introducing collaboration among existing system capabilities to create a larger capability based on the
meta-architecture selected for the SoS. The degree of influence the SoS manager's guidance exerts on
individual system architectures in implementing the SoS meta-architecture can be classified as
directed, acknowledged, collaborative, or virtual. Acknowledged SoS have documented objectives, a
designated manager, and defined resources for the SoS; nonetheless, the constituent systems retain
their independent ownership, objectives, funding, development, and sustainment approaches. Acknowledged
SoS share some similarities with directed SoS and collaborative SoS. The four types of SoS are
described below:
Figure 1 Schematic drawing of four classical types of SoS based on Degree of Control and Degree of Complexity
– Virtual
• Virtual SoS lack a central management authority and a centrally agreed upon purpose for the system-of-systems.
• Large-scale behavior emerges—and may be desirable—but this type of SoS must rely upon relatively invisible mechanisms to maintain it.
– Collaborative
• In collaborative SoS the component systems interact more or less voluntarily to fulfill agreed upon central purposes.
– Acknowledged (FILA-SoS integrated model is based on Acknowledged SoS)
• Acknowledged SoS have recognized objectives, a designated manager, and resources for the SoS; however, the constituent systems retain their independent ownership, objectives, funding, and development and sustainment approaches.
• Changes in the systems are based on collaboration between the SoS and the system.
– Directed
• Directed SoS are those in which the integrated system-of-systems is built and managed to fulfill specific purposes.
• It is centrally managed during long-term operation to continue to fulfill those purposes as well as any new ones the system owners might wish to address.
• The component systems maintain an ability to operate independently, but their normal operational mode is subordinated to the central managed purpose.
This research is based on Acknowledged SoS. The major objectives of the research are:
– To develop a simulation for Acknowledged SoS architecture selection and evolution.
– To have a structured, repeatable approach for planning and modeling.
– To study and evaluate the impact of individual system behavior on SoS capability and
architecture evolution process.
The dynamic planning for an SoS is a challenging endeavor. Department of Defense (DoD) programs constantly face challenges to incorporate new systems and upgrade existing systems over time under threats, constrained budgets, and uncertainty. It is therefore necessary for the DoD to be able to look at future scenarios and critically assess the impact of technology and stakeholder changes. The DoD is currently looking for options that represent affordable acquisition choices and lessen the cycle time for early acquisition and new technology insertion. FILA-SoS provides a decision aid for answering some of these questions.
This volume gives an overview of a novel methodology known as Flexible Intelligent & Learning Architectures in System-of-Systems (FILA-SoS). Some of the challenges that are prevalent in SoS architecting, and how FILA-SoS attempts to address them, are explained in the next section.
1.2 System of System Challenges
Recent developments are helping us to understand Complex Adaptive Systems. Such systems are at the edge of chaos, as they maintain dynamic stability through constant self-adjustment and evolution. Chaos and order are two complementary states of our world, and a dynamic balance exists between them. Order and structure are vital to life: order ensures consistency and predictability and makes the creation of systems possible. However, too much order leads to rigidity and suppresses creativity. Chaos constantly changes the environment, creating disorder and instability, but can also lead to emergent behavior and allows novelty and creativity. Thus, sufficient order is necessary for a system to maintain an ongoing identity, along with enough chaos to ensure growth and development. The challenge in Complex Adaptive Systems design is to design an organized complexity that will allow a system to achieve its goals. An SoS is a complex system by nature because its component systems are operationally independent elements and also managerially independent of each other. This means that component systems preserve their existing operations independent of the SoS. An SoS develops in an evolutionary fashion and, due to its large-scale complex structure, shows emergent behavior. Emergence means the SoS performs functions that do not reside in any one component system.
The 2012 INCOSE SoS Working Group survey identified seven ‘pain points’, raising a set of questions for systems engineering of SoS, which are listed in Table 1 (Dahmann, 2012).
Table 1 System of Systems and Enterprise Architecture Activity

Pain Point | Question
Lack of SoS Authorities & Funding | What are effective collaboration patterns in systems of systems?
Leadership | What are the roles and characteristics of effective SoS leadership?
Constituent Systems | What are effective approaches to integrating constituent systems into a SoS?
Capabilities & Requirements | How can SE address SoS capabilities and requirements?
Autonomy, Interdependencies & Emergence | How can SE provide methods and tools for addressing the complexities of SoS interdependencies and emergent behaviors?
Testing, Validation & Learning | How can SE approach the challenges of SoS testing, including incremental validation and continuous learning in SoS?
SoS Principles | What are the key SoS thinking principles, skills and supporting examples?
The importance of each pain point and its impact on systems engineering are illustrated below:
i. Lack of SoS Authorities & Funding and Leadership pose several severe governance and management issues for an SoS. This condition has a large impact on the ability to apply systems engineering (SE) in the classical sense to an SoS. In addition, this problem affects modeling & simulation activities.
ii. Constituent Systems play a very important role in the SoS. As explained earlier, they usually have different interests and ambitions, which may or may not be aligned with those of the SoS. Similarly, the models, simulations, and data for these systems will naturally be attuned to the specific needs of the systems, and may not lend themselves easily to supporting SoS analysis or engineering.
iii. Autonomy, Interdependencies & Emergence are ramifications of the varied behaviors and interdependencies of the constituent systems, which make the SoS a complex adaptive system. Emergence comes naturally in such a state and is often unpredictable. While modeling & simulation can aid in representing and measuring these complexities, it is often hard to reproduce real-life emergence. This is due to a limited understanding of the issues, which can bring up serious consequences during validation.
iv. The overall Capability of the SoS and the capability needs of the individual systems may be stated at a high level and need definition in order to align them with the requirements of the SoS mission. The SoS mission is supported by constituent systems, which may not be able (or willing) to address those requirements.
v. Testing, Validation & Learning becomes difficult since the constituent systems continuously evolve and adapt, as does the SoS environment, which includes stakeholders, governments, etc. Therefore, creating a practical test-bed for simulating a large dynamic SoS is a challenge in itself. Again, modeling & simulation can solve part of the problem, such as enhancing live testing and addressing risk in the SoS when testing is not feasible; however, this requires a crystal-clear representation of the SoS, which can be difficult as discussed in the earlier points.
vi. SoS Principles are still being understood and implemented. Therefore, the rate of success is yet to be addressed formally. This puts some pressure on the progress of SoS engineering. Similarly, there is an absence of a well-established, agreed-upon space of SoS principles to drive development and knowledge. This constricts the effective use of potentially powerful tools.
DoD 5000.2 is currently used as the acquisition process for complex systems. Schwartz (2010) described this process as an extremely complex systemic process that cannot consistently produce systems with the expected cost or performance. Acquisition in the DoD is an SoS problem that involves the architecting, placement, evolution, sustainment, and discarding of systems obtained from a supplier or producer. Numerous attempts undertaken to modify and reform the acquisition process have found this problem difficult to tackle because the models have failed to keep pace with actual operational scenarios. Dombkins (1996) offered a novel approach that models complex projects as waves. He suggested that there exists a major difference in managing and modeling traditional projects versus complex projects. He further illustrated his idea through a wave planning model that exhibits a linear trend on a time scale; on a spatial scale, it tries to capture the non-linearity and recursiveness of the processes. In general, the wave model is a developmental approach that is similar to periodic waves. A period, or multiple periods, can span a strategic planning horizon, and the instances within the periods represent the process updates. Dahmann, Lane, Rebovich, and Baldwin (2008) recently proposed that SoS architecture development for the DoD acquisition process can be anticipated to follow a wave model process. According to Dahmann, DoD 5000.2 may not be applicable to the SoS acquisition process. Acheson (2013) proposed that Acknowledged SoS be modeled with an Object-Oriented Systems Approach (OOSA). Acheson also proposed that, for the development of an SoS, the objects should be expressed in the form of an agent-based model.
The environment and the systems are continuously changing. Let there be an initial environment model, which represents the SoS acquisition environment. As the SoS acquisition progresses, the environment variables are updated by the SoS Acquisition Manager to reflect the current acquisition environment. Thus, the new environment model at a new time has different demands. To fulfill the demands of the mission, a methodology is needed to assess the overall performance of the SoS in this dynamic situation. The motivation for evolution is change in the SoS environment (Chattopadhyay, Ross, & Rhodes, 2008).
The environmental changes consist of:
• SoS Stakeholder Preferences for key performance attributes
• Interoperability conditions between new and legacy systems
• Additional mission responsibilities to be accommodated
• Evolution of individual systems within the SoS
Evaluation of architectures is another SoS challenge area. It lends itself to a fuzzy approach because the criteria are frequently non-quantitative or subjective (Pape & Dagli, 2013), or based on difficult-to-define or even unpredictable future conditions, such as “robustness.” Individual attributes may not have a clearly defined, mathematically precise, linear functional form from worst to best. The goodness of one attribute may or may not offset the badness of another. An architecture with several moderately good attributes and one very poor attribute may be better than an architecture with all marginally good attributes, or vice versa. A fuzzy approach allows many of these considerations to be handled using a reasonably simple set of rules, and it can include non-linear characteristics in the fitness measure. The simple rule set allows small adjustments to be made to the model to see how seemingly small changes affect the outcome. The methodology outlined in this research and technical report follows a multi-level, plug-and-play modeling approach to address various aspects of the SoS acquisition environment: SoS architecture evaluation, SoS architecture evolution, and SoS acquisition process dynamics, including the behavioral aspects of constituent systems.
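The rule-based fuzzy evaluation idea can be sketched with a minimal type-1 example. This is an illustrative toy, not the report's actual assessor: the attribute names, membership-function shapes, rule set, and output centroids are all assumptions, chosen only to show how a simple rule set yields a non-linear fitness in which one very poor attribute can outweigh several good ones.

```python
# Minimal sketch of a type-1 rule-based evaluator (illustrative only):
# attribute scores in [0, 1] are fuzzified into "poor"/"good" sets, and two
# rules produce an overall fitness where one bad attribute dominates.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify(score):
    # Degrees of membership in "poor" and "good" (hypothetical shapes).
    return {"poor": tri(score, -0.5, 0.0, 0.6), "good": tri(score, 0.4, 1.0, 1.5)}

def evaluate(attrs):
    """attrs: dict of attribute name -> normalized score in [0, 1]."""
    m = {k: fuzzify(v) for k, v in attrs.items()}
    # Rule 1: if ALL attributes are good, fitness is high (min = fuzzy AND).
    all_good = min(d["good"] for d in m.values())
    # Rule 2: if ANY attribute is poor, fitness is low (max = fuzzy OR).
    any_poor = max(d["poor"] for d in m.values())
    # Weighted-centroid defuzzification over {low: 0.2, high: 0.9}.
    num = any_poor * 0.2 + all_good * 0.9
    den = any_poor + all_good
    return num / den if den else 0.5

balanced = evaluate({"performance": 0.7, "affordability": 0.7, "robustness": 0.7})
one_bad = evaluate({"performance": 0.9, "affordability": 0.9, "robustness": 0.1})
assert one_bad < balanced  # a single very poor attribute drags fitness down
```

Adjusting a membership shape or a rule here changes the fitness landscape directly, which mirrors the point above about seeing how seemingly small changes affect the outcome.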
1.3 How does FILA-SoS address SoS pain points
The first pain point is Lack of SoS Authorities & Funding, which raises the question, “What are effective collaboration patterns in systems of systems?”
Since there is a lack of SoS authority, and persuasion is involved in the workings of an SoS, systems are allowed to negotiate with the SoS manager. The deadline for preparation, the funding, and the performance required to complete the mission are some of the issues that form the negotiation protocol. Moreover, different combinations of behavior types assigned to the systems can help gauge the most effective collaboration patterns in systems of systems after the negotiations end.
The leadership issue poses the question, “What are the roles and characteristics of effective SoS leadership?” This is addressed by incorporating views from multiple stakeholders while assessing the architecture’s quality. In addition, we maintain that the characteristics are similar to those an Acknowledged SoS manager would exhibit while distributing funds and resources among systems for a joint operation. The SoS manager also has the opportunity to base decisions on the most likely future scenarios, providing an edge compared to other models. This will improve the acquisition process in terms of overall effectiveness, reduced cycle time, and the integration of legacy systems. Overall, the role of the leadership is presented as that of a guide rather than someone who would foist his authority.
The third pain point question, “What are effective approaches to integrating constituent systems into a SoS?”, is addressed as follows. A balance has to be maintained during acquisition between the amount of resources used and the degree of control exercised by the SoS manager over the constituent systems. Meta-architecture generation is posed as a multi-objective optimization problem to address this pain point. The constituent systems and the interfaces between them are selected while optimizing resources such as operations cost, interfacing cost, and performance levels. The optimization approach also evaluates the solutions based on the views of multiple stakeholders, integrated together using a fuzzy inference engine.
The fourth pain point asks, “How can SE address SoS capabilities and requirements?” Organizations that acquire large-scale systems have transformed their attitude toward acquisition. These organizations now want solutions that provide a set of capabilities, not a single specific system meeting an exact set of specifications. During the selection process it is ensured that a single capability can be provided by more than one system; the idea is to choose at least one system having each required capability to form the overall capability of the SoS.
The fifth pain point, on autonomy, interdependencies, and emergence, is one of the most important objectives of this research. It can be described as “How can SE provide methods and tools for addressing the complexities of SoS interdependencies and emergent behaviors?” Each system has an autonomous behavior maintained through pre-assigned negotiation behaviors and differing operations costs, interfacing costs, and performance levels while providing the same required capability. Interfacing among systems is encouraged to follow a net-centric architecture: the systems communicate with each other through several communication systems, which ensures proper communication channels. Together, the behavior and net-centricity make the SoS a complex system, thus bringing out the emergence needed to address the mission.
FILA-SoS is an excellent integrated model for addressing the complexities of SoS interdependencies and
emergent behaviors as explained in the above paragraphs.
As far as the sixth pain point, on testing, validation, and learning, goes, FILA-SoS has been tested on three notional examples so far: the ISR problem, the Search and Rescue (SAR) problem, and the toy problem for Aircraft Carrier Performance Assessment. For ISR (refer to Figure 2), a guiding physical example is taken from history. During the 1991 Gulf War, Iraqi forces used mobile SCUD missile launchers called Transporter Erector Launchers (TELs) to strike at Israel and Coalition forces with ballistic missiles. Existing intelligence, surveillance, and reconnaissance (ISR) assets were inadequate to find the TELs during their vulnerable setup and knock-down time. The “uninhabited and flat” terrain of the western desert was in fact neither of those things, with numerous Bedouin goat herders and their families, significant traffic, and thousands of wadis with culverts and bridges to conceal the TELs and obscure their movement.
Figure 2 ISR System-of-Systems for testing FILA-SoS
A Coast Guard Search and Rescue (SAR) SoS engineering and development problem serving the Alaskan coast is selected (Figure 3). Detailed information about this case study can be found in Dagli et al. (2013). There is increasing use of the Bering Sea and the Arctic by commercial fisheries, oil exploration, and science, which increases the likelihood of SAR scenarios.
Figure 3 SAR System-of-Systems for testing FILA-SoS
The toy problem for assessing the performance of the aircraft carrier involves multiple systems, such as satellites, UAVs, and ground stations, that support the aircraft carrier in fulfilling the mission (refer to Figure 4). Results have been obtained for multiple waves of the evolution process for all the examples.
Figure 4 Assessing the Performance of Aircraft Carrier for testing FILA-SoS
The examples discussed above clearly show the domain independence of FILA-SoS.
FILA-SoS is a novel method for making sequential decisions over a period of time for SoS development. The goal is to apply the integrated model to dynamically evolve and optimize the SoS architecture, and to design and validate it through simulation tools. The integrated model structure can be applied to various application areas, including development of a dynamic water treatment SoS architecture, development of a dynamic Air Traffic Management SoS, and development of an autonomous ground transport SoS. FILA-SoS has a number of abilities that make it unique, such as:
• Aiding the SoS manager in future decision making
• Assisting in understanding the emergent behavior of systems in the acquisition environment and its impact on SoS architecture quality
• Facilitating the learning of the dynamic behavior of different types of systems (cooperative, semi-cooperative, non-cooperative)
• Identifying intra- and interdependencies among SoS elements and the acquisition environment
• Modeling and application to a wide variety of complex systems, such as logistics and cyber-physical systems
• Acting as a test-bed for decision makers to evaluate operational guidelines and principles for managing various acquisition environment scenarios
• Modeling SoS that evolve over a period of time under uncertainty, through its multiple-wave simulation capability
2 Overview of the FILA-SoS integrated model
This section describes an overview of FILA-SoS. The model uses a straightforward system-definition methodology and an efficient analysis framework that supports the exploration and understanding of key trade-offs and requirements by a wide range of system-of-systems stakeholders and decision makers in a short time. FILA-SoS and the Wave Process address four of the most challenging aspects of system-of-systems architecting:
i. Dealing with the uncertainty and variability of the capabilities and availability of
potential component systems.
ii. Providing for the evolution of the system-of-system needs, resources and environment
over time.
iii. Accounting for the differing approaches and motivations of the autonomous component
system managers.
iv. Optimizing system-of-systems characteristics in an uncertain and dynamic environment with a fixed budget and resources.
2.1 Definition of Variables for SoS
This list comprises the notation for the variables used to solve the Acknowledged SoS architectural evolution problem:
C: the overall capability (the overall goal to be achieved by combining sub-capabilities)
c_j, j ∈ J, J = {1, 2, …, M}: the constituent system capabilities required
s_i, i ∈ I, I = {1, 2, …, N}: the systems present in the SoS problem (N in total)
A: an N × M matrix with entries a_ij, where
  a_ij = 1 if capability j is possessed by system i
  a_ij = 0 otherwise
P_i: performance of system i for delivering all its capabilities Σ_j a_ij
F_i: funding of system i for delivering all its capabilities Σ_j a_ij
D_i: deadline for system i to participate in this round of mission development
IF_ik: the interface between systems i and k, such that i ≠ k, k ∈ I
IC_i: the cost of interface development for system i
OC_i: the cost of operations for system i
KP_r, r ∈ R, R = {1, 2, …, Z}: the key performance attributes of the SoS
FA: funding allocated to the SoS manager
p ∈ {1, 2, …, P}: the negotiation attributes for bilateral negotiation
t: current round of negotiation (epoch)
t_max: total rounds of negotiation possible
V_pi^SoS(t): the value of attribute p for the SoS manager at time t for system i
V_pi^S(t): the value of attribute p for the owner of system i at time t
TQ: threshold architecture quality
The model also involves a list of stakeholders, such as the Acknowledged SoS manager, the system owners/managers, and the SoS environment.
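To make the notation concrete, the snippet below encodes a tiny capability matrix A and checks whether a chosen subset of systems covers every required sub-capability. The sizes and entries are toy values for illustration, not data from the report.

```python
# Illustrative encoding of the capability matrix A (a_ij = 1 if system i
# possesses capability j) and a coverage check over a subset of systems.

N, M = 3, 4  # N systems, M required sub-capabilities (toy sizes)
A = [
    [1, 0, 1, 0],  # system 1
    [0, 1, 0, 0],  # system 2
    [0, 0, 1, 1],  # system 3
]

def covers_all(selected):
    """True if the selected systems jointly provide all M capabilities."""
    return all(any(A[i][j] for i in selected) for j in range(M))

assert covers_all({0, 1, 2})
assert not covers_all({0, 1})   # capability j = 3 is missing
```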
Figure 5 The Wave Model of SoS initiation, engineering, and evolution
FILA-SoS follows Dahmann’s proposed SoS Wave Model process for architecture development of the DoD acquisition process, as depicted in Figure 5. FILA-SoS addresses the most important challenges of SoS architecting with regard to dealing with the uncertainty and variability of the capabilities and availability of potential component systems. The methodology also provides for the evolution of the system-of-systems needs, resources, and environment over time while accounting for the differing approaches and motivations of the autonomous component system managers. FILA-SoS assumes an uncertain and dynamic environment with a fixed budget and resources for architecting the SoS. The overall idea is to select a set of systems and interfaces based on the needs of the architecture in a full cycle called a wave. Within a wave there may be many negotiation rounds, which are referred to as epochs. After each wave, the systems selected during negotiation in the previous wave remain part of the meta-architecture, while new systems are given a chance to replace those left out.
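The wave-and-epoch control flow can be sketched schematically. Everything below (function names, the stopping rule, the toy negotiation) is an illustrative simplification of the process just described, not FILA-SoS code:

```python
def run_waves(num_waves, t_max, TQ, generate, negotiate, quality):
    """Schematic wave/epoch loop: each wave builds a meta-architecture,
    negotiates over up to t_max epochs, and stops early once the
    negotiated architecture reaches the threshold quality TQ."""
    accepted = set()                       # systems carried over between waves
    for wave in range(num_waves):
        meta = generate(accepted)          # meta-architecture generation
        for t in range(t_max):             # negotiation epochs
            accepted = negotiate(meta, accepted, t)
            if quality(accepted) >= TQ:
                break
    return accepted

# Toy run: each epoch, one more system from the meta-architecture accepts.
result = run_waves(
    num_waves=1, t_max=5, TQ=2,
    generate=lambda acc: {0, 1, 2},
    negotiate=lambda meta, acc, t: acc | {min(meta - acc)} if meta - acc else acc,
    quality=len)
# result == {0, 1}: negotiation stopped once two systems were on board
```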
The processes involved in the Wave Model and their analogs in FILA-SoS can be explained starting with the first stage, Initialize the SoS. In terms of initialization, the wave process requires understanding the SoS objectives and operational concept (CONOPS) and gathering information on core systems to support the desired capabilities. This starts with the overarching capability C desired by the Acknowledged SoS manager, the definition of the sub-capabilities c_j required to produce capability C, and FA, the funding allocated to the SoS manager. These also form the input to FILA-SoS for the participating systems s_i. FILA-SoS requires t_max, the number of negotiation cycles, the selection of the meta-architecture modeling procedure, and the system negotiation models assigned to participating systems.
The second stage is called Conduct SoS Analysis. For the Wave process, it represents establishing an initial SoS baseline architecture for SoS engineering based on the SoS requirements space, performance measures, and relevant planning elements. In FILA-SoS the baseline architecture is called the meta-architecture. Building the meta-architecture is essentially selecting the systems s_i and their respective capabilities a_ij. Meta-architecture modeling requires the values of KP_r, the key performance attributes of the SoS; P_i, the performance of system i; F_i, the funding of system i; and D_i, the deadline for system i to participate in this round of mission development, which is assumed to be the total for all capabilities possessed by system i. IC_i, the cost of developing a single interface for system i, and OC_i, the cost of operations for system i, are also needed at this stage of the model. The next step is Develop/Evolve SoS. In terms of the Wave process, the essential changes in contributing systems, in terms of interfaces and functionality, needed to implement the SoS architecture are identified. Within FILA-SoS this signals the command to send connectivity requests to individual systems and to start the negotiation between the SoS and the individual systems. This stage requires the number of negotiation attributes P for a bilateral negotiation between the Acknowledged SoS manager and each system s_i selected in the meta-architecture, and t_max, which denotes the total rounds of negotiation possible.
The next phase in the Wave process is Plan SoS Update. In this phase the architect plans for the next SoS upgrade cycle based on the changes in the external environment, SoS priorities, options, and backlogs. There is an external stimulus from the environment that affects the SoS architecture. To reflect this, FILA-SoS determines which systems to include based on the negotiation outcomes and forms a new SoS architecture. Finally, the last stage in the Wave process is Implement SoS Architecture, which establishes a new SoS baseline based on SoS-level testing and system-level implementation. In FILA-SoS the negotiated architecture quality is evaluated based on KP_r, the key performance attributes of the SoS. If the architecture quality does not meet TQ, the threshold architecture quality, the Acknowledged SoS manager and the systems s_i selected in the meta-architecture go into renegotiation. Finally, the process moves on to the next acquisition wave. The evolution of the SoS should take into account the availability of legacy systems and new systems willing to join, adaptation to changes in mission and requirements, and the sustainability of the overall operation. FILA-SoS can also convert the meta-architecture into an executable architecture using the Object Process Model (OPM) and Colored Petri Nets (CPN) to examine the overall functionality and capability of the meta-architecture. These executable architectures are useful in providing much-needed information to the SoS coordinator for assessing the architecture quality, helping him negotiate better.
Some highlights of FILA-SoS are described in terms of its capabilities, its value added to systems engineering, its ability to perform “what-if” analysis, the modularity of its integrated models, its potential applications in the real world, and future additions to the current version. The most important capability of FILA-SoS is that it is an integrated model for modeling and simulating SoS with evolution over multiple waves. Secondly, all models within FILA-SoS can be run independently or in conjunction with each other. Thirdly, there are two model types that represent SoS behavior and the various individual system behaviors. Finally, it has the capacity to study the negotiation dynamics between the SoS and individual systems. The value FILA-SoS adds to systems engineering is that it aids the SoS manager in future decision making and can help in understanding the emergent behavior of systems in the acquisition environment and its impact on SoS architecture quality. In addition, it has three independent system behavior models, referred to as cooperative, semi-cooperative, and non-cooperative. These behavior models are used to study the dynamic behavior of different types of systems while they are negotiating with the SoS manager. FILA-SoS also assists in identifying intra- and interdependencies among SoS elements and the acquisition environment.
FILA-SoS can also facilitate a “what-if” analysis using variables, such as SoS funding and capability priority, that can be changed as the acquisition progresses through wave cycles. The parameter settings for all negotiation models can be changed, and rules of engagement can be simulated for different combinations of system behaviors.
Potential applications of FILA-SoS include complex systems models such as logistics and cyber-physical systems. In addition, it can act as a test-bed for decision makers to evaluate operational guidelines and principles for managing various acquisition environment scenarios. Future capabilities that we would like to include are extending the model to multiple interface alternatives among systems and incorporating risk models into the environmental scenarios.
2.2 Independent modules of FILA-SoS
FILA-SoS has a number of independent modules that are integrated together for meta-architecture generation, architecture assessment, meta-architecture executable modeling, and meta-architecture implementation through negotiation. An overall view is presented in Figure 6.
Figure 6 Integrated modules within FILA- SoS
All the independent models are listed below for reference:
1. Meta-Architecture Generation Model
2. Architecture Assessment Model
3. SoS Negotiation Model
4. System Negotiation Model: Non-Cooperative
5. System Negotiation Model: Cooperative
6. System Negotiation Model: Semi-Cooperative
7. Executable Architecting Model: OPM & CPN
8. Overall Negotiation Framework
The first meta-architecture generation method is the fuzzy-genetic optimization model (Pape, Agarwal, Giammarco & Dagli, 2014). This model is based on evolutionary multi-objective optimization for SoS architecting with many key performance attributes (KPAs). It also has a type-1 fuzzy assessor for dynamic assessment of domain inputs, which forms the fitness function for the genetic algorithm. It returns the best architecture (meta-architecture), consisting of systems and their interfaces. It is a generalized method with application to multiple domains, such as the Gulf War Intelligence/Surveillance/Reconnaissance case and the Alaskan Maritime Search and Rescue case.
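The genetic-algorithm side of this model can be sketched as follows. This is a generic GA skeleton, not the report's implementation: in FILA-SoS the fitness would come from the type-1 fuzzy assessor, whereas here a trivial stand-in fitness (the count of 1-bits) is used so the sketch stays self-contained. All parameter values are illustrative.

```python
# Bare-bones genetic algorithm with truncation selection, one-point
# crossover, and bit-flip mutation over a binary chromosome. The fitness
# callable is pluggable; FILA-SoS would supply a fuzzy assessor here.
import random

def ga(fitness, n_bits, pop_size=20, generations=50, p_mut=0.05, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)        # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([bit ^ (rng.random() < p_mut) for bit in child])
        pop = parents + children
    return max(pop, key=fitness)

best = ga(fitness=sum, n_bits=12)  # toy fitness: maximize the number of 1-bits
```

In the architecting context, the chromosome would encode the selected systems and interfaces, and `fitness` would be the defuzzified architecture-quality score.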
The second meta-architecture generation model is based on multi-level optimization (Konur & Dagli, 2014). In this model, architecting is done in two rounds: the first initiates the SoS by selecting the systems to be included, and the second improves the SoS’s performance by allocating funds to the participating systems. The model is generic, based on multiple attributes such as maximum performance, minimum cost, and minimum deadline. It is based on a Stackelberg game-theoretic approach between the SoS architect and the individual systems.
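The leader-follower structure can be illustrated with a toy bilevel problem. The response function, budget, and numbers below are invented for illustration and are not the model in Konur & Dagli (2014): the leader (SoS architect) chooses a fund allocation while anticipating each system's best response.

```python
# Toy Stackelberg illustration: the leader searches discrete fund
# allocations, anticipating the followers' responses.
from itertools import product

def follower_response(funds):
    """Each system (follower) turns allocated funds into performance with
    diminishing returns -- a hypothetical best-response function."""
    return funds ** 0.5

def leader_best_allocation(budget, n_systems):
    """Exhaustive search over integer allocations (fine for a toy problem)."""
    best, best_val = None, float("-inf")
    for alloc in product(range(budget + 1), repeat=n_systems):
        if sum(alloc) <= budget:
            val = sum(follower_response(f) for f in alloc)
            if val > best_val:
                best, best_val = alloc, val
    return best

# With diminishing returns, the leader spreads the budget evenly:
assert leader_best_allocation(budget=6, n_systems=3) == (2, 2, 2)
```

The actual model optimizes richer attributes (performance, cost, deadline), but the anticipate-then-allocate structure is the same.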
The particle swarm optimization (PSO) technique for meta-architecture generation (Agarwal, Pape, & Dagli, 2014) is similar to the fuzzy-genetic model, except that the evolutionary optimization technique in this case is based on swarm intelligence. In addition, some new key performance attributes are used to calculate architecture quality. Cuckoo search optimization (Agarwal, Wang, & Dagli, 2014) based meta-architecture generation is again a new biologically inspired optimization method. It has been shown that in certain cases it performs better than PSO.
The first architecture assessment method is based on type-1 fuzzy logic systems (FLS) (Pape et al., 2013). The Key Performance Parameters (KPPs) chosen are performance, affordability, flexibility, and robustness. The method can capture the viewpoints of multiple stakeholders and can accommodate any number of KPPs.
Another architecture assessment method is based on type-2 fuzzy modular nets (Agarwal, Pape & Dagli, 2014). The attributes used for evaluation were performance, affordability, developmental modularity, net-centricity, and operational robustness. Type-1 fuzzy sets can model the ambiguity in the input and output variables; however, they are insufficient for characterizing the uncertainty present in the data. Type-2 fuzzy sets, proposed by Zadeh (1975), can model that uncertainty and minimize its effects in an FLS (Mendel & John, 2002).
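A minimal type-1 sketch of the idea (the membership breakpoints, attribute universes, and rule below are invented for illustration, not taken from the report's assessor):

```python
def tri(x, a, b, c):
    """Triangular type-1 membership function rising from a, peaking at b,
    falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Three illustrative gradations of an attribute scored on [0, 10]
def low(x):    return tri(x, -1e-9, 0.0, 5.0)
def medium(x): return tri(x, 0.0, 5.0, 10.0)
def high(x):   return tri(x, 5.0, 10.0, 10.0 + 1e-9)

# One Mamdani-style rule: "IF performance is high AND affordability is
# medium THEN quality is good" fires with the min of its antecedents.
def rule_strength(performance_score, affordability_score):
    return min(high(performance_score), medium(affordability_score))
```

A type-2 assessor would replace each crisp membership curve with a bounded band (a footprint of uncertainty), letting the same rule base tolerate disagreement among stakeholders about where the breakpoints lie.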
It is not possible to implement such a meta-architecture without persuading the systems to participate; to address this issue, a negotiation model based on game theory is proposed (Ergin, 2014). It is an incentive-based negotiation model for increasing the participation of individual systems in a Search and Rescue SoS. The model provides a strategy for SoS management to determine the amount of incentive necessary to persuade individual systems while achieving its own goal. The incentive contract is designed around the objectives of both the SoS and the individual systems: each individual system seeks the highest incentive for minimal effort, while the SoS manager aims to convince individual systems to join the SoS development while maximizing its own utility. Determining the incentives can be formulated as a multi-constraint problem in which the SoS manager selects a reward for each individual system that maximizes the manager's expected utility while satisfying the constraints of the individual systems.
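The reward-selection idea can be caricatured as follows (system names, costs, and values are hypothetical; the actual model works with expected utilities under uncertainty rather than known effort costs):

```python
def design_incentives(effort_costs, added_values):
    """For each candidate system, offer a reward equal to its reservation
    (effort) cost whenever the value it adds to the SoS exceeds that
    reward; the SoS manager keeps the surplus as utility."""
    offers, manager_utility = {}, 0.0
    for name, cost in effort_costs.items():
        if added_values[name] > cost:       # participation is worth buying
            offers[name] = cost
            manager_utility += added_values[name] - cost
    return offers, manager_utility

offers, utility = design_incentives(
    {"radar": 3.0, "uav": 5.0, "cutter": 8.0},   # each system's effort cost
    {"radar": 7.0, "uav": 4.0, "cutter": 10.0},  # value each adds to the SoS
)
# the uav costs more to persuade than it adds, so it receives no offer
```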
Another negotiation model, based on clustering and neural networks, has been developed (Agarwal, Saferpour & Dagli, 2014). This model adapts the negotiation policy to each individual system's behavior, which is not known to the SoS manager. The behavior is predicted by clustering the differences between successive multi-issue offers; the clustered data is then used to train supervised learning models for future prediction.
Individual systems providing required capabilities can use three kinds of negotiation models, depending on their negotiation strategies: a non-cooperative linear optimization model, a cooperative fuzzy negotiation model, and a semi-cooperative Markov chain model (Dagli et al., 2013).
Executable architectures are generated using a hybrid of Object Process Methodology (OPM) and Colored Petri Nets (CPN) (Agarwal, Wang, & Dagli, 2014; Wang, Agarwal, & Dagli, 2014; Wang & Dagli, 2011). An executable architecture model is imperative for analyzing the interactions among the participating systems in achieving the overall SoS capabilities. In this research, a modeling approach that combines the capabilities of OPM and CPN is proposed. Specifically, OPM is used to specify the formal system model, as it can capture both the structural and behavioral aspects of a system in a single model. CPN supplements OPM by providing simulation and behavior analysis capabilities; consequently, a mapping between OPM and CPN is needed. OPM modeling supports both the object-oriented and process-oriented paradigms. CPN supports state-transition-based execution semantics with discrete-event simulation capability, which can be used to conduct extensive behavior analyses and to derive many performance metrics. The following section presents the fuzzy-genetic architecture generation model.
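The CPN execution semantics rest on token firing, which can be sketched minimally (colour sets, timing, and the OPM mapping omitted; place and transition names are invented):

```python
def fire(marking, transition):
    """Fire a Petri-net transition if every input place holds a token;
    return the new marking, or None if the transition is not enabled."""
    inputs, outputs = transition
    if not all(marking.get(p, 0) > 0 for p in inputs):
        return None
    new = dict(marking)
    for p in inputs:
        new[p] -= 1
    for p in outputs:
        new[p] = new.get(p, 0) + 1
    return new

# a 'detect' step needs both a ready sensor and a present target
detect = (["sensor_ready", "target_present"], ["track_started"])
m1 = fire({"sensor_ready": 1, "target_present": 1}, detect)
```

Replaying such firings over the mapped OPM model is what lets the executable architecture expose timing, contention, and throughput behavior before any system is modified.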
3 Architecture Assessment
3.1 Acknowledged SoS
Very few SoS are under the sort of tight central control typically attributed to military organizations by
nonmilitary members. On the continuum of degree of central control of SoS described in the SE Guide
for SoS, ranging from extremely tight to near anarchy, almost no SoS exist at either of the far ends of the
scale (Director Systems and Software Engineering, OUSD (AT&L), 2008). Most SoS exist near the
center of the scale, as acknowledged SoS; where there is some recognized central authority, but not
complete, centralized control, authority, or budget. Even in the military, authority is broadly delegated.
An acknowledged SoS is defined as an overlay on existing component systems that have an independent existence outside the SoS; they have their own architectures, missions, stakeholders, and funding sources (Bergey, 2009). Moreover, successful managers of acknowledged SoS understand that
existing component systems work best if they are perturbed as little as possible to meet the new
requirements necessary to contribute to the SoS. That is, if the component systems’ architectures are left
to the systems engineering and architecture professionals at the systems’ hierarchy level. It is in the best
interest of the SoS manager to coordinate and guide individual systems rather than attempt to issue
commands to them. On one hand, the component (existing, independently managed) systems have no
need to accede to a SoS manager’s requests/demands, nor to officially report through SoS management
staff teams. On the other hand, there may be numerous reasons to cooperate with the SoS manager’s
desired changes. These include the opportunity (or excuse) to break open their architecture to make those
minor adjustments required to join the SoS – this could allow an opportunity to fix some ongoing
problems that do not on their own account justify such ‘breakage.’ Another reason might be to stretch out
the life of the program (and its contractors) with fresh, new tasking, when otherwise the system would be
approaching its end of life, decommissioning and disposal. A system might already be planning to make
changes to its architecture that could easily accommodate the desirable SoS changes, but the new SoS
opportunity could be a bonus source of funds or the basis of further upgrades to make the system even
more relevant in its own domain in the future.
This modeling framework is offered for an acknowledged SoS, where each component system is a fully
functioning, independently funded and managed system (represented by the SPO). A high-level (SoS
level) official envisions an opportunity to achieve a needed, new capability by using combinations of
existing systems in a new way, such that component systems can be left largely unchanged, or
incorporated with relatively minor changes. This acknowledged SoS approach is only useful if it can
achieve the new capability for either or both of the following:
• A reduced cost compared to designing a separate, new "purpose built" system, and/or
• A reduced time to field the new capability.
Defense Secretary Rumsfeld famously said “…you go to war with the army you have, not the army you
might want or wish to have at a later time” (Suarez, 2004). Therefore, the concept of this acknowledged
SoS meta-architecture is that the major capabilities are built into the systems already, but small, quick
changes can be made to interfaces to enhance those existing capabilities when used in a cooperative
manner. The proposed architecture is a novel, binary system and interface architecture, which will be
called the meta-architecture throughout this document. It will guide the SoS architecture development
through a wave model evolution in capability over time (Dahmann, Rebovich, Lane, Lowry, & Baldwin,
2011) with incremental improvements after it starts operation.
The new capabilities being sought in the SoS are achieved by combining mostly existing system
capabilities and/or adding new capabilities that arise in conjunction with other systems (i.e., through new
interfaces) (CJCSI 6212.01F, 12 Mar 2012). If simply throwing more systems (with their individual
capabilities) at the problem were sufficient, there would be no need to create the SoS. Therefore, all
successful acknowledged SoS architectures need to invest in the relationships (and interfaces) between
the systems comprising the SoS. Further, improvements in SoS attributes of interest such as performance,
affordability, reliability, etc., must also arise from the interfaces, otherwise there is no advantage over
simply adding individual systems’ capabilities. The nature of the acknowledged SoS implies that the SoS
manager does not have absolute authority to command system participation (nor interoperability
changes), but must “purchase” the component systems’ participation and modifications not merely with
funds but also through persuasion, the strength of the vision of the SoS, quid pro quos, the bully pulpit,
appeals to good sense, and whatever other means are legitimate and effective (Director Systems and
Software Engineering, OUSD (AT&L), 2008). Individual systems remain free to decide not to participate
in the SoS development, although that choice may cost those systems something, too. That cost would
not come from the SoS budget.
Alternately, some of the desired systems may not be available to the SoS during a particular operational
period of need. They may be down for maintenance, assigned to a higher priority mission, or
geographically distant on their day-to-day missions, therefore not able to contribute. Some capabilities
and interfaces may already exist, meaning they are free and fast for development, but they still may have a
significant cost to operate in a fielded SoS. This may be a reason not to ask them to participate. Some
systems may have enough capability that the SoS can tap their spare capability while they pursue their
original tasks, so they are essentially free to operate. Other capabilities may need minor (compared to a
new start major program) development efforts, either within a system or by developing a new interface
with another system. The performance capabilities of the SoS will generally be greater than the sum of
the capabilities of its parts (Singh & Dagli, 2009). If this were not the case, there would be no need for
the SoS. Changing the way the systems interact with no modifications typically would not improve the
SoS capabilities; it would simply add more systems with their individual capabilities. Systems
engineering in the overall context of the SoS must address all the attributes of disparate groups of
systems, as well as any crucial issues which affect their behavior.
An instance of an acknowledged SoS might be a military command and control SoS that has transitioned
from a tightly knit group of a few systems to an acknowledged SoS that now includes many more
previously independent systems. This could be due to a change in the implementation or the importance
of the missions currently being supported, or of a change of importance and increase in complexities of
potential cross-cutting (new) SoS capabilities (Dahmann, Baldwin, & Rebovich, 2009). Another
acknowledged SoS might be a regional Air Traffic Control (ATC) system that crosses national
boundaries. National ATCs are independent, but find it in their interest to cooperate and interface with
the regional ATC.
One way to develop better tools for predicting performance is to use proposed new tools on a very simple
model, where the results can be calculated independently. Exploring the working of a tool on simple
models can build confidence that the tool does what it is intended to do. Another way to build confidence
is to choose a model that can be extended in a very straightforward manner to more complex situations.
Real SoS may have very complex architectures, but at the most basic level, they may be boiled down to
‘are the systems here or not, and which of them interface with each other.’ (If they do not interface with
each other, they are not a SoS, but simply a collection of systems.) This simple model of the SoS is
really a meta-architecture.
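One plausible encoding of such a binary meta-architecture, 'are the systems here, and which interface with each other' (the exact bit layout is an assumption for illustration, not taken from the report):

```python
def decode(chromosome, m):
    """Split a binary chromosome into m participation bits followed by the
    m*(m-1)//2 upper-triangular interface bits; an interface only counts
    when both of its endpoint systems participate."""
    systems = chromosome[:m]
    interfaces, k = {}, m
    for i in range(m):
        for j in range(i + 1, m):
            interfaces[(i, j)] = chromosome[k] and systems[i] and systems[j]
            k += 1
    return systems, interfaces

# systems 0 and 1 participate; only the 0-1 interface bit is set
systems, interfaces = decode([1, 1, 0, 1, 0, 0], 3)
```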
3.2 Proposed Modeling Approach
The model is a decision-making aid for the SoS manager. It does not so much find the best solution to
designing a SoS, as help the manager explore the influence of the various constraints on the shape of a
reasonable solution. The method starts, as shown in Figure 7, from the SoS context and goals, using the
simplified binary meta-architecture including the full range of candidate systems and their interfaces.
Guided interviews uncover the SoS purpose, characteristics of candidate systems, key attributes that
characterize the SoS and methods for measuring the SoS in each of these attributes. The key attributes
generally lend themselves to linguistic characterization and ranges of measures that may be handled
through fuzzy logic. A subset of the characteristic capabilities of the component systems is discovered
and documented. Estimated costs, schedule and performance goals are established for the systems,
interfaces, and SoS as a whole. When the models are combined in the SoS model, it is ‘end-around’
checked for consistency, and typically needs to be adjusted in trial runs until it is self-consistent. The
completed model can assess any proposed SoS architecture within the meta-architecture for its KPAs, and
provide an overall assessment of the SoS for its ability to satisfy its numerous constraints. The
constraints and their combination may be described in the rules of a fuzzy inference system. Seeing the
results of the characterization of the KPAs, the other inputs, the combinations of systems and buildup of
SoS capabilities from the component systems is the most useful part of the method for the SoS manager
and the stakeholders. Variations of all inputs, assumptions, rules, etc. may be examined to identify the
most influential characteristics of the problem to ensure the formulation of the problem and solutions are
proper and helpful.
Figure 7. SoS evaluation method determines the fitness of each architecture, or (system + interface) SoS arrangement, from the meta-architecture and domain dependent information
Figure 7 shows generalized steps for how to derive the set of attributes by which to evaluate the fitness of
a selected arrangement of the systems and their interfaces to provide required capabilities to the SoS.
Attributes desirable in the completed SoS architecture are elicited through linguistic analysis of guided interviews with stakeholders. Having developed the attributes of interest, the possible
ranges of evaluation in each attribute are separated into an agreeable number of gradations of goodness or
badness (defining the membership functions for fuzzy sets) with some overlap due to ambiguities in
linguistic representation. The relative value of combinations of performances in each attribute is
developed into fuzzy rules through a series of stakeholder hypothetical tradeoff exercises. The multiple
objective optimization (MOO) problem of finding a good architecture to achieve acceptable values of the
several attributes of the SoS may be solved by finding an architecture that maximizes the single fuzzy
SoS fitness evaluation. The method treats the independent variable as a chromosome with randomly positioned ones and zeroes, and the dependent variable as the SoS fitness or overall quality.
Exploring the architecture ‘space’ by evaluating a few hundred chromosomes with varying percentages of
ones provides insight into whether a solution is possible. The fuzzy membership function edges and the
rules may need to be adjusted to find a set of tunable parameters that closes on itself, i.e., one that finds
any solution. In this case, a genetic algorithm approach is used to find a near optimal arrangement from
the meta-architecture if one exists. (It is certainly possible to design a problem for which no acceptable
solutions exist.) Such an organized method combining all these steps has not previously been applied to SoS. Because of the many simplifications in the method, it is not expected to provide final solutions
directly, but to give insight into behaviors of possible real solutions in response to changes in rules,
definitions of capabilities, performance models, membership function shapes, environment, budgets, etc.
that drive aspects of the development and evolution of SoS.
3.3 SoS Attributes
Systems engineers call the areas of engineering design that require detailed knowledge and detailed
analysis tools ‘specialty engineering’ areas (INCOSE, 2011). These types of areas may also be called
attributes of SoS. Just as a measure of ‘reliability’ or ‘availability’ may require very detailed analyses at
many levels within a system design, but result in a single overall number to characterize the design in that
specialty area, the attributes of a SoS may require detailed analyses, but result in a single characterizing
number. The attributes or specialty areas are sometimes called '-ilities'; they are the subject of
continuing, intense research, especially in the area of SoS. Large lists of the attributes, many with several
definitions, are being catalogued and organized in several on-going efforts (Mekdeci, Shah, Ross,
Rhodes, & Hastings, 2014) (Ross, Rhodes, & Hastings, 2008). Just as that single number characterizing
a system in a specialty area may have numerous conditions limiting its applicability, the attribute
measures characterizing the SoS will probably be valid over a limited range of scenarios. To understand
the implications of a particular measure, one needs to know about all those conditions. Simply presenting
that data in an intelligible format is a challenge. Finally, since the specialty engineering areas typically
have well-known algorithms and procedures for evaluating combinations of subsystems that are easily
extended to combinations of systems, this effort attempts to deal with more appropriately SoS-specific
attributes. These SoS attributes might be described as the ones which depend more heavily on the SoS
architecture, which is detailed in the chromosome.
A key feature of the attributes of either systems or SoS is that they frequently pull in different directions.
For example, improving speed may reduce range, both key attributes of overall technical performance.
Improving reliability may increase cost, thereby reducing acquisition affordability, but possibly
increasing operations and maintenance affordability. Numerous other candidate attributes of SoS exert
pulls along different directions in the multi-dimensional design or architecture space. The selected
architecture must satisfy the most unhappy stakeholder enough to avoid a veto. The stakeholders’
concerns are represented in the attributes selected to grade the value of the proposed architectures. The
models used to evaluate the attributes must be fully described and open to stakeholders so they can assure
themselves the competition among architectures is fair. The weighting between attributes must be open
and fair as well.
Pitsko and Verma (Pitsko & Verma, 2012) describe four principles to make a SoS more adaptable. They
spend a large part of their time describing what adaptable means to various stakeholders, that different
stakeholders may continue to have slightly different concepts of what adaptability means, that the
definition is probably dynamic – changing over time, and that this ambiguity likely will apply to many
other SoS attributes. Schreiner and Wirthlin describe a partial failure to fully model a space detection SoS architecture, but note that they learned a great deal about how to improve the approach the next time (Schreiner & Wirthlin, 2012). The point is that people are not modeling according to a well-developed theory of SoS
and then reporting on the success or failure: they are still attempting to define the theory.
There are numerous approaches in the literature attempting to describe useful attributes, as well as how to
measure them, to help understand or predict the value of various architectural arrangements. These
include evolvability and modularity almost as complementary attributes (Clune, Mouret, & Lipson,
2013), while Christian breaks evolvability into four components described as extensibility, adaptability,
scalability and generality (Christian III, 2004). Christian introduces the concept of complexity to overlay
on these attributes because a ‘too simple’ system cannot evolve. Kinnunen reviews at least four
definitions of complexity (Kinnunen, 2006) before offering his analysis of one related to the object
process methodology of Dori. Mordecai and Dori extend that model to SoS specifically for
interoperability (Mordecai & Dori, 2013). Fry and DeLaurentis also discussed measuring netcentricity
(interoperability within the SoS), noting the difficulty of pushing the commonly used heuristics too far,
because the Pareto front exists in multiple dimensions (Fry & DeLaurentis, 2011), not just two as
commonly depicted. Ricci et al. discuss designing for evolvability of their SoS in a wave model and
playing it out several cycles in the future, evaluating cost and performance (Ricci, Ross, Rhodes, &
Fitzgerald, 2013). Because SoS are complex, there are many ways to look at them, with no dominant
theory yet. This is why this direction of research is interesting and worth pursuit (Acheson, et al., 2012).
Slightly different definitions for some of the SoS attributes were chosen for this work, especially for
flexibility and robustness. Lafleur used flexibility in the operational context of changing a system after deployment (Lafleur, 2012), in a different way than Deb and Gupta's classic notion of robustness (Deb & Gupta, 2006), which shifts the optimum (defined as narrowly better performance) rather than accepting lower performance across a
wider front. Singer used robustness in a different operational context (Singer, 2006), that of losing a node
in a network, rather more like losing a system or an interface from the SoS as described here. Gao et al.
discussed a concept of robustness as the ability to withstand hacker attacks for networks of networks with
varying degrees of interconnectedness (Gao, Buldyrev, Stanley, & Havlin, 2011). The concept of the
flexibility attribute used here is more attuned to giving the SoS manager flexibility during development,
when selecting systems to supply all the desired capabilities. This falls right in line with some of the
thinking of recent discussions of resilience and sensitivity analyses, although they use the terms resiliency
or robustness for it (Smartt & Ferreira, 2012) (Yu, Gulkema, Briaud, & Burnett, 2011) (Jackson & Ferris,
2013). The point is that there are many possible ways to describe the attributes of SoS, depending on
circumstances, organizations, and stakeholders’ preferences. Many of these ways depend directly on the
architecture of the system of interest. This dependency on interconnectedness fits into the framework of
the architecture meta-model used here. If an attribute does not depend on the SoS architecture in any
way, then it will not be useful to help select between potential architectures. It is not necessary that a
useful ranking algorithm be very accurate in its relationship to the measured attribute, only that it be highly correlated with reality and nearly monotonic in its ranking. That is sufficient to be useful in
this approach.
For purposes of this research effort, the following key attributes for a family of ISR SoS were defined by
a group of subject matter experts (SMEs) during the SERC research task RT-44 (Dagli, et al., 2013):
• Performance: Generally, the sum of the performance in required capabilities of the individual component systems, with a small boost in performance due to increased coordination through interfaces. This is explained further in the section on net-centricity.
• Affordability: Roughly the inverse of the sum of the development and operation costs of the SoS. The performance delta above is applied in a different way to the affordability, to change its shape as a function of the number of interfaces, but also to be somewhat related to superior performance.
• Developmental Flexibility: Roughly the inverse of the number of sources that the SoS manager has for each required sub-capability. If a required capability is available from only one component system, then the SoS manager's flexibility is very small; they must have that system as part of the SoS. On the other hand, if each capability is available from multiple systems within the SoS, the manager has far more developmental flexibility.
• Robustness: The ability of the SoS to continue to provide performance when any individual participating system, along with all its interfaces, is removed. Generally, having a very high performing system as part of your SoS is a good thing; however, if that system is ever absent, the performance of the SoS may be degraded substantially. Therefore, it may be useful to have the contributions of the individual system capabilities more widely dispersed, so that the loss of one system does not represent as great a loss to the SoS (Pape & Dagli, 2013).
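Toy versions of two of these attributes, assuming a binary architecture encoding (the 5% interface boost and all data are invented for illustration):

```python
def performance(systems, perf, interfaces):
    """Sum of participating systems' capability scores, with a small
    boost per active interface (the 5% figure is invented)."""
    base = sum(p for s, p in zip(systems, perf) if s)
    return base * (1 + 0.05 * sum(interfaces.values()))

def robustness(systems, perf, interfaces):
    """Fraction of full performance retained when the worst-case single
    system, together with all of its interfaces, is removed."""
    full = performance(systems, perf, interfaces)
    worst = full
    for k, present in enumerate(systems):
        if not present:
            continue
        reduced = systems[:k] + [0] + systems[k + 1:]
        kept = {e: v for e, v in interfaces.items() if k not in e}
        worst = min(worst, performance(reduced, perf, kept))
    return worst / full
```

With two systems of scores 3 and 1 joined by one interface, losing the strong system leaves only a small fraction of the performance, which is exactly the concentration risk the robustness attribute is meant to penalize.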
4 Proposed Method for Developing an SoS Evaluation Model
4.1 Use Case Model of the Domain Independent Method
The method for developing an architecture evaluation model of an SoS is the same regardless of domain.
The steps of the method are shown in the use case summary diagram of Figure 8. The SoS manager is a
key player, along with the SoS stakeholders, in forming a vision of the desired, acknowledged SoS
capabilities. Information from potential component systems also contributes to the SoS vision. The
vision of the SoS informs the model facilitator for exploring ways to model the desirable SoS attributes.
This may include what fraction of the system capabilities the SoS will require, defining the meaning of
the attributes and SoS missions in context, and establishing trade space limits to explore within the SoS
meta-architecture. Other inputs include estimated costs for modification and operation of the systems
within the SoS, which ideally would come from system stakeholders or SMEs, but usually start as
estimates from the SoS manager. The modeler works with the model facilitator and various SMEs to
develop attribute evaluation models that depend on the meta-architecture structure. These individual
attribute evaluation models are combined through a fuzzy logic rule based system to assess the overall
SoS. With this assessment tool, sample architectures represented in the meta-model may be evaluated for
relative fitness as an entire SoS.
This fitness assessment tool is precisely what is needed by a GA to sort the better architectures within a
mutating population of trial chromosomes searching out the meta-architecture space. Sensitivity analyses
can be run by the modeling team in consultation with the SoS manager. The consensus SoS design may
then be presented by the SoS manager to the SPO managers for negotiation about any minor changes
required to join the SoS. The documentation developed during the modeling effort is even more
important for SoS explanation than for the legal and regulatory prescriptions of the DoDAF for official
Program of Record (POR) systems, because the SoS is outside the pre-existing design and training of the
component systems. Results of the negotiations also need to be well documented, because SMEs may
provide additional information to the negotiations, and stakeholders will want to know what capabilities
their systems agree to provide to the SoS.
Figure 8. Use case diagram for developing a SoS Architecture; dashed lines are for information; solid lines are ‘responsible for,’ or active involvement; this portion of the effort excludes the NegotiateAchievableSoS use case
The list of data required, and the variable names used throughout this effort, for the generic SoS model is shown in Table 2. This is a simplified, binary model of the systems' presence or absence from the SoS, and of the non-directed interfaces between each pair of systems.
Table 2. List of SoS and component systems' variable meanings within the meta-architecture (name or description of variable: expression)

1. Name of SoS: sos
2. Number of potential systems: m
3. Number of types of systems: t
4. Names of system types: sys_typ_i, i ∈ {1,…,t}
5. Number of component capabilities: n
6. Names of component capabilities: sys_cap_i, i ∈ {1,…,n}
7. Binary meta-architecture upper triangular matrix: A_ij, i ∈ {1,…,m}, j ∈ {i,…,m}
8. Individual systems of the SoS: A_ij, i ∈ {1,…,m}, j = i (also sometimes written A_ii, or simply A_i)
9. Feasible interface: A_ij, i ∈ {1,…,m}, j > i, with A_jk = 1, A_ik = 1, A_ii = 1, A_jj = 1, A_kk = 1, where A_kk is any communications system
10. SoS main capability: C
11. SoS performance in its large capability: P_SoS
12. Component capabilities of systems: c_ij (binary matrix), i ∈ {1,…,n capabilities}, j ∈ {1,…,m systems}
13. Performance of a particular system in its key capability: P_i^Ss, i ∈ {1,…,m}, where Ss is each system
14. Estimated funding to add an interface to an individual system: FIF_i^Ss, i ∈ {1,…,m}
15. Deadline for developing new interface(s) on a system: D_i^Ss, i ∈ {1,…,m}
16. Estimated funding for operation of all the participating systems during an SoS operation: FOP_i^Ss, i ∈ {1,…,m}
17. Function describing the advantage of close collaboration within a SoS, as a function of participating systems and interfaces: F(A_ii, A_ij), j ≠ i, i ∈ {1,…,m}, j ∈ {i,…,m}
18. Function for combining system capabilities into SoS capability C: C = Σ_{k=1}^{n} Σ_{i=1}^{m} A_ii c_ki
19. Number of individual attributes the stakeholders want to evaluate the SoS over: g
20. Attribute names to evaluate SoS architectures against (e.g., cost, performance, flexibility): Att_k, k ∈ {1,…,g attributes}
21. Number of gradations of each attribute that become fuzzy membership functions (FMF): h_k, k ∈ {1,…,g attributes}
22. Fuzzy membership function names within each attribute (granulation = a, attribute = b): FMF_ab, a ∈ {1,…,h_k gradations}, b ∈ {1,…,g attributes}
23. Fuzzy membership function boundaries (crossover points) for each of the b SoS attributes: Bound_ab, a ∈ {1,…,h+1}, b ∈ {1,…,g}; a = 1 is the lower bound of the universe of discourse, and a ∈ {2,…,h+1} is the upper bound of FMF_(a-1)b, because Matlab cannot handle matrix subscripts of zero
24. Overall SoS performance in an attribute: (Σ_{k=1}^{n} Σ_{i=1}^{m} A_ii c_ki) × F(A_ii, A_ij), j ≠ i
25. Total cost of developing and using an SoS: TC = Σ_{j=1}^{n} Σ_{i=1}^{m} A_ij FIF_i^Ss + Σ_{k=1}^{n} Σ_{i=1}^{m} A_ii FOP_i^Ss
26. Parameters for controlling the GA: Mutation Rate, Number in Population, Number of Generations, DeltaPG
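Expressions 18 and 25 of Table 2 can be exercised directly on toy data (the matrices and cost vectors below are invented for illustration):

```python
def sos_capability(A, c):
    """Expression 18: C = sum_k sum_i A_ii * c_ki over participating systems."""
    m = len(A)
    return sum(c[k][i] for k in range(len(c)) for i in range(m) if A[i][i])

def total_cost(A, fif, fop):
    """Expression 25: interface development costs plus operating costs."""
    m = len(A)
    interface_cost = sum(A[i][j] * fif[i]
                         for i in range(m) for j in range(i + 1, m))
    operating_cost = sum(A[i][i] * fop[i] for i in range(m))
    return interface_cost + operating_cost

A = [[1, 1, 0],          # systems 0 and 1 participate (diagonal entries),
     [0, 1, 0],          # with one interface between them (A[0][1])
     [0, 0, 0]]
c = [[1, 0, 1],          # capability 0 offered by systems 0 and 2
     [0, 1, 0]]          # capability 1 offered by system 1
cap = sos_capability(A, c)
tc = total_cost(A, [2.0, 2.0, 2.0], [1.0, 1.0, 1.0])
```

Note that system 2's contribution to capability 0 is excluded because it does not participate (A_22 = 0), matching the A_ii gating in expression 18.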
Figure 9 shows an alternate view of the method as a process flow with emphasis on the individual steps,
without concern for who performs them.
4.2 Domain Independent Model Development
The SoS model includes all the information available to it from the sources gathered from the participants
identified in Figure 8, but it still must be cast in terms of the binary participation model of the meta-
architecture.
The first step, regardless of domain, is to identify the reasons for the SoS and the desired capabilities.
The SoS manager and the facilitator must always develop some background and vocabulary within the
domain so that meaningful discussions may be held among stakeholders. At this point one can begin to
create domain specific models of development schedules, costs, performance, and other attributes to be
used in evaluating an SoS architecture. The steps of the general method, however, are the same
regardless of the domain of the model as shown in Figure 9. Many modeling approaches in the literature
assume the architecture is already defined. This is similar to SE methods that assume the requirements
are well defined – nice and clean, but neither realistic nor adequate. The DoDAF, Ver. 2.02, to its credit,
begins at the proper place when it describes a domain independent six-step process for how to build an
architecture model for a large DoD system:
1. Determine Intended Use of Architecture
2. Determine Scope of Architecture
3. Determine Data Required to Support Architecture Development
4. Collect, Organize, Correlate, and Store Architectural Data
5. Conduct Analyses in Support of Architecture Objectives
6. Document Results in Accordance with Decision-Maker Needs (ASD(NII), 2010)
Figure 9. Domain Independent Process Method for SoS Model building
This research extends the DoDAF system oriented model to SoS, adding detail on how to create,
document and use a similar model building process for an SoS. This will form a basis to help designers
and managers choose SoS architectures more wisely in the future.
The DoDAF viewpoints may be extended to the buildup of any SoS (military, civil or commercial) in
nearly exactly the same way it is intended to be used to document the vision, plans, capabilities, and
workings of a complex weapons system.
4.2.1 Establishing A Vision of The SoS

A SoS is by definition a group of independently capable systems, collaborating for a greater purpose, in
other words, to deliver a larger capability. Within some range, systems may be present in varying
numbers (or not at all) for a particular application of the SoS. The concept for the SoS must be
articulated, captured, and agreed to among the stakeholders in relation to this variability in participation.
Some SoS, after being developed, are on stand-by until called on to perform; others may implement a new
capability that is actually operating all the time. The ideal SoS provides an acceptable range of
capabilities over a broad range of compositions. Typically, the SoS manager (or management group)
creates a vision statement to guide development of the concept for the SoS. The vision includes a high
level description of the goals of the SoS, the potential types of participants and their capabilities, and the
mission(s), threat(s), and a description of how the SoS arrangement will improve existing capabilities, or
provide new ones. The architecture model of the SoS captures this vision but also provides the
framework for decomposing the vision to manageable components as well as for building up the SoS out
of legacy, new, or modified systems. The SoS manager must start with information like that shown in
Table 3, which corresponds to the DoDAF AV-1 Overview and Summary Information. (Corresponding
roughly to Step 1 of the DoDAF 2.02 6-Step Architecture Development Process.)
Table 3 Example SoS evaluation model building questionnaire for creating an AV-1
Overarching Purpose Of SoS
A DoDAF OV-1 style description is often helpful; text should accompany it
Unique Value Of SoS
What makes it better than simply adding another legacy system
SoS Measures Of Effectiveness
How will you know how good it is?
Issues That Might Limit Effectiveness
Are changes of procedure necessary? Are there legal, regulatory, or bureaucratic impediments to the creation of the SoS?
SoS Features That Might Greatly Increase SoS Effectiveness
Can changes in procedures help? What is the innovation?
Desired Effectiveness
What would be considered really good, what’s adequate, what’s inadequate?
ROM Budget: Development
ROM Budget: Operations
Desired Schedule
Attributes Of The SoS/Range Limits For Fuzzy Evaluation
What might be ‘tradeable’ – Suggestions for fuzzy rules, e.g., is extra performance worth more budget? Is extra flexibility worth more? How much? Is lack of flexibility OK? etc.
Capabilities Of Contributing Systems
How do they combine?
Component Legacy Systems:
Type/Category | Capabilities | Time to Develop/Equip | Costs $M (Dev and Ops) | Notes (Incompatibilities, Constraints, Characteristics, etc.)
etc.
4.2.1.1 An Operational View OV-1 High Level Operational Concept

The type of information that the SoS manager must have for the ‘Vision of the SoS’ is at least one
example of how the SoS would be used (or a list of examples with all their context). The example must
discuss expected participants in a rough picture (whether in graphics or text) of what the SoS will do in
operation. An initial draft of this information may be summarized in one or two pages, as shown in
Table 3 for the All Viewpoint. This may be expanded to the OV-1 Operational
Overview that describes how the system will be used in slightly greater detail. It can be a graphic with
accompanying text showing the overall concept of use of the SoS as shown in Figure 10. Every term
used in the descriptions is defined in the All View 2, the Integrated Dictionary (AV-2). Major component
systems, data or resource exchanges, and effects are depicted iconically to present an overall, high level
impression of how the SoS may be dispatched, controlled, employed, and recovered, for example, as
shown in Figure 10. For a SoS, support is normally presumed to be supplied by the system operators in
their continuing independent missions, unless significant changes are imposed by the SoS configuration.
Major mission segments are shown in the OV-1. The unifying SoS Integrated Dictionary (AV-
2) is started with the OV-1, building outward so that all terms, components, activities, and interfaces are
defined in one place.
Tracking the sources of definitions is more necessary for an SoS than for a system. Differences in usage
of similar terms between component system stakeholders, model developers and operational users should
be flagged in the AV-2 by noting multiple definitions for the same or similar terms within their proper
contexts. This is especially important in an acknowledged SoS because the nominally independent
component systems may have their own unique acronyms, terms or usages. The OV-1 establishes the
scope of the SoS. A key component of most SoS is mission flexibility – the ability to pivot to different
postures or missions as conditions change. A discussion of the range of likely activities of the SoS should
be included in the textual explanation of the OV-1, or even as multiple graphics for different missions if
that is part of the SoS charter. The OV-1 of a SoS must also include a discussion of priority between the
SoS mission versus the original and continuing missions of the systems, and should also include a
generalized discussion of how deeply the SoS architecture will be allowed to control the component
systems. That is, to what extent major interfaces enabling the SoS need to be controlled by (or at least
communicated to) the SoS manager through the architecture, versus where existing systems may continue
control of their own configurations. (An extension of Step 2 of DoDAF 2.02 to handle the SoS.)
Figure 10. Sample OV-1 for Ballistic Missile Defense (ASN/RDA, 2009)
4.2.1.2 Collecting Descriptive Domain Information

Identifying the numerous stakeholders and their concerns, and gathering data about component systems
and their missions are key parts of developing the required domain knowledge to build a SoS. This
process step is the same regardless of domain. The method is domain independent, but the data gathered
are now domain dependent. An initial rough level of knowledge is needed to allow a facilitator to
make plans for stakeholder interviews. Identification of key discussion points and possible areas of
tradable concepts within the early SoS construct are made at this point. However, until detailed
discussions with the stakeholders are held, one must not jump to conclusions about what is valuable or
tradable, nor even what the SoS framework looks like. Facilitated discussions with the stakeholders must
draw out the following features of the SoS:
• Desired and composable capabilities, with expected or desired levels of performance
• Concepts of operation for the desired new SoS capabilities
• Likely scenarios for the employment of the SoS
• Key performance parameters, with expected or desired levels of performance
• Possible algorithms to combine component capabilities into SoS capabilities
• Shared (as well as conflicting) judgments about potential evaluation criteria for SoS attributes
• Relative ranking of, and expected values of, attributes of the SoS by groups of stakeholders
• Rough estimates of cost, schedule, and performance deltas for required minor changes to
existing systems to achieve desired SoS interfaces, or performance
• An overall ‘feel’ for how the SoS might work in practice.
An important part of developing a SoS architecture is to define all the component systems’ ownership,
missions, and priorities in case some capabilities must be ‘hijacked’ to support the SoS. Identifying all
affected stakeholders is the second part of the facilitation exercise. In the Pentagon, this is called ‘staffing
a position paper.’ Since SoS normally include systems from multiple domains, as well as across a
range of life cycle stages, affected stakeholder identification requires careful and extensive
coordination. As the stakeholders are identified, they should be placed in a hierarchy of command,
tasking, and funding chains. This network is the basis of the Organizational Relationships Viewpoint
(OV-4), which serves as an excellent template for SoS in any domain, not only military ones. In normal
DoDI 5000.02 system development, this is nominally within one service, and most of the relationships are
obvious. In a SoS, whether military, civil, or commercial, this effort may require special attention and
care to achieve successful coordination across major organizational boundaries (Director Systems and
Software Engineering, OUSD (AT&L), 2008). Major concerns of each stakeholder should be discovered,
recorded, tracked and updated over time, to aid in the coordination of initial tasking as well as changes to
the goals of the SoS over its lifecycle. An ideal place for this information is in the OV-4 part of the
DoDAF. Capability managers (or at least communities of interest) may be defined in the Capability
Taxonomy viewpoint (CV-2). These are cross referenced and mapped in the Capability Dependencies
viewpoint (CV-4) among the component systems and stakeholders. An ideal SoS would have a variety of
ways to achieve each of its required capabilities, perhaps with varying efficiencies. Having only a single
way to achieve a required capability is an exceedingly poor way to design a SoS; due to the independent
nature of the component systems’ missions, there is no guarantee that all possible systems will always be
made available to the SoS. The concerns expressed in terms of the US DoD programs are equally
applicable to any complex set of entrenched bureaucracies, such as companies in supply chains, divisions
of corporations, or elements of intergovernmental enterprises.
The desired capabilities of the SoS, as well as those of the component systems must be carefully defined
and accounted for both as a function of participating numbers of systems but also over time, as the SoS
plans to mature. An ideal architecture should handle not only incremental improvements over time as
capabilities evolve, but also a range of numbers of component systems. This accounts not only for
technological improvements but also for the availability of systems. The number of systems can change
on any particular day due either to logistic availability or to higher priorities outside the SoS. The
attribute models of the SoS must be developed as functions of these variables. A SysML approach could
allow parametric definition of capabilities and effectiveness to be explicitly built into the model. Other
approaches may require additional math models, which ideally will be based on architectural data from
the SoS model and the participation represented in the meta-architecture model.
Linguistic analysis (‘computing with words’) (Singh & Dagli, "Computing with words" to support multi-
criteria decision-making during conceptual design, 2010) of the stakeholder discussions allows one to
deduce a set of attributes, potential membership function shapes, and rules for combining attribute values
to create an overall SoS fitness evaluation. It may be necessary to iterate definitions of membership
function shapes and rules to get a reasonable set that works together. Working together here means that
the attribute measures do not overlap, nor correlate too well, among themselves (i.e., they are orthogonal,
or nearly so). If they were duplicative, it would tend to give too much weight to a subset of issues,
instead of optimizing over the broadest range of attributes.
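The near-orthogonality requirement can be checked by sampling chromosomes and correlating trial attribute measures against one another. The three measures and the 0.95 correlation threshold below are illustrative assumptions, not the report's attribute algorithms; the `cost` measure is deliberately near-duplicative of `performance` to show what the check flags.

```python
import numpy as np

rng = np.random.default_rng(0)

# Trial attribute measures over a binary chromosome (hypothetical algorithms)
def performance(x):  return float(x.sum())
def cost(x):         return 3.0 * x.sum() + x[:4].sum()   # deliberately near-duplicative
def flexibility(x):  return float(len(set(np.flatnonzero(x) // 4)))

measures = [performance, cost, flexibility]
X = rng.integers(0, 2, size=(200, 16))                    # 200 sampled chromosomes
vals = np.array([[f(x) for f in measures] for x in X])
corr = np.corrcoef(vals.T)                                # pairwise correlation matrix

# Flag attribute pairs that correlate too strongly, i.e., duplicative measures
duplicative = [(i, j) for i in range(len(measures))
               for j in range(i + 1, len(measures)) if abs(corr[i, j]) > 0.95]
```

Here `duplicative` flags the performance/cost pair, signalling that the attribute set would over-weight that shared concern and should be revised before being used in the fitness evaluation.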
Attribute characteristics and desirable ranges identified in the linguistic analysis are combined with fuzzy
evaluation and a set of rules to derive a meta-architecture based, overall fitness value from the
participating systems and interfaces. Level setting and model checking runs may need to be performed to
ensure the story is self-consistent. Then the model can be sampled for stakeholders’ validation. These
steps are shown in Figure 11.
The following variables are identified for the SoS model development:
Figure 11. Domain independent method to gather domain dependent data for SoS architecture model development and analysis, with output of a ‘good’ architecture at the lower right
4.2.2 Understanding Stakeholders’ Views

A DoD acknowledged SoS is a very large, complex endeavor. SoS by definition create cross-functional
organizations. They bring together functions that may have been built up through separate, large systems
(and their program offices) that were developed over many years for many reasons, and only recently
appear to have the potential to improve the effectiveness of a process or create a new capability by the
joining together of these previously disparate systems. The new capability is highly desired, but not of
overriding importance in the acknowledged SoS. Many stakeholders are inevitably involved in a SoS.
The stakeholders include at least the following recursive classes of interested parties:
• Component Systems (System Program Offices (SPOs) in the DoD or management agencies or corporations, and all the single system stakeholders that they represent)
• The SoS Manager or management agency
• Payers/funders (typically Congressional Committees, DoD, and Services for military systems, but also finance offices of other state or federal agencies, or CEOs of corporations)
• Congressional committees/watchdog agencies
• National or Theater Command Authority for military
• Users/beneficiaries of the SoS
• Operators of the SoS
• Competitors of the SoS
• Enemies/threats/targets of the SoS
• Allies of the U.S.
• Press/public opinion
A similar list could be made for other types of SoS in the civil or commercial domains. Occasionally
individual stakeholders may be members of several groups simultaneously. Additional stakeholders may
be professional organizations, industry groups, standards organizations, municipalities, rulemaking
agencies, shareholders of corporations, charities, entrenched bureaucracies, unions, non-governmental
organizations, etc. ‘Due diligence’ is the term for doing the work to identify the stakeholders of a
proposed SoS, their degree of influence, and their level of concern about changes to their existing systems
to make the SoS work.
4.2.2.1 Relationships To Established Decompositions: Task Lists, Joint Capability Areas, ISO Standards
When the domain is military, the Universal Joint Task List, the Service specific task lists, and the Joint
Capability Areas provide excellent vocabulary for defining the missions and capabilities required for
military tasks, independent of the systems used to achieve them (Joint Staff, 2010)
( [email protected], 2009). This vocabulary of capabilities and tasks (activities in UML or SysML
style modeling) is aggregated in the SoS AV-2 and model behavior definitions, so that each time that a
term, word or concept is used in the architecture, reports, or performance models, it is consistent and clear
to every stakeholder or participant. Other domains than military typically have manuals, corporate,
industry or government standards, scholarly, or professional guidance documents, or even textbooks to
provide this background of vocabulary and definitions. The ISO-10303 series of standards is another
source of guidance, particularly AP233, Systems Engineering Data Representation. In fact, there are
usually so many possible sources that it is highly advisable to maintain source tracking within the AV-2,
with priorities assigned to each source to prevent confusion when a term is overloaded by multiple
definitions depending on context.
The DoD task lists suggest very high level definitions of measures of effectiveness for
evaluating the performance of the capabilities. These are potentially valuable sources for determining
membership function shapes and edge values. These are typically ‘improved upon’ for specific system
solutions, but they serve as an excellent starting point for drafting evaluation criteria for the SoS,
especially in performance. For the first pass through a fuzzy analysis, the membership function shapes
are not too important; triangular or trapezoidal shapes work well enough to get started. At the
preliminary stage of analyzing choices with crude, initial models, it is more important to get the
terminology, ordering, and trade space rules agreed to among the stakeholders than to have highly
accurate membership function shapes.
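A triangular membership function of the kind suggested here is easy to sketch. The gradation names and breakpoints below are assumptions chosen for illustration, not boundaries elicited from stakeholders.

```python
def tri(x, left, peak, right):
    """Triangular membership: 0 outside (left, right), rising to 1 at peak."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

# Three gradations of a hypothetical 'performance' attribute on a 0-100 scale
grades = {
    "poor":       lambda x: tri(x, -1.0, 0.0, 40.0),
    "acceptable": lambda x: tri(x, 20.0, 50.0, 80.0),
    "excellent":  lambda x: tri(x, 60.0, 100.0, 101.0),
}
memberships = {name: f(55.0) for name, f in grades.items()}
# A raw score of 55 is partly 'acceptable' and not at all 'poor' or 'excellent'
```

At this preliminary stage only the ordering and rough overlap of the gradations matter; the exact slopes can be refined once the stakeholders agree on the trade space rules.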
Other ‘-ilities’ models may contribute to SoS attributes – reliability, availability, affordability,
survivability, flexibility, adaptability, agility, ability to be redirected, autonomy, precision, among many
others (Mekdeci, Shah, Ross, Rhodes, & Hastings, 2014), may be useful in evaluating characteristics of a
particular SoS meta-architecture. SoS attributes should be created through reasonable extrapolations of
the component systems’ capabilities to each area, with a small improvement factor for self-coordination.
(If the SoS has no advantage over the simple sum of component systems’ capabilities, then there is no
need for the SoS – simply send more systems to do the task.) By the time one has defined the vision,
capabilities, stakeholders, components, and measures of effectiveness, there should be enough of a basis
to decide what additional data will be required to develop the architecture evaluation models. (Step 3 of
DoDAF 2.02.)
4.2.2.2 Capability Improvement of A Proposed SoS

The concept in the FILA-SoS for the buildup and even emergence of capabilities within the SoS is that
capabilities are brought to the SoS basically intact by the component systems as currently existing.
Typically, the SoS improves the sum of the individual component system capabilities by a change in the
way they work together to provide some unique or even greatly improved capability. Assume the
interfacing of those systems together in a new way can be made to improve performance by a small
multiplier for each connection. This is a typical approach, introduced as the concept of netcentricity by
Alberts, Garstka and Stein in the late 1990s (Alberts, Garstka, & Stein, 1999). This is equivalent to the
small delta in performance for each used-feasible interface. It is a greatly simplified notion to regard the
performance improvement as a simple exponential function of the number of interfaces; there is
undoubtedly a plateau effect on the lower end, whereby a minimum number of systems must be interfaced
to be able to see the effect. On the high end there is no doubt also a limit to improvement, imposed by
information overload, latency, and bandwidth limits. The simplifying
assumption that more interfaces are better is nevertheless quite reasonable over a broad range between the
two extremes, especially since it is limited to a small fraction for each interface.
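The plateau-limited interface advantage described above might be sketched as follows. The delta, the minimum link count, and the cap are assumed values for illustration, not the report's calibration.

```python
def collaboration_factor(n_interfaces, delta=0.02, min_links=2, cap=1.5):
    """Per-interface netcentric advantage, gated below and capped above.

    Below min_links the effect is not yet visible; beyond the cap,
    information overload, latency, and bandwidth limit further gains.
    """
    if n_interfaces < min_links:
        return 1.0
    return min((1.0 + delta) ** n_interfaces, cap)
```

In the broad middle range the factor grows geometrically with each added interface; at both extremes it is flat, matching the plateau argument above.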
4.2.2.3 Decomposition of Capabilities To Functions And Logical Views

The high level capabilities described in the AV-2 and OV-1 can be decomposed to lower level actions
and/or functions allocable to the potential component systems. This continues iteratively, exactly as in
normal/standard systems engineering, until both a functional hierarchy and behavioral description can be
attributed to component systems. Some systems may need upgrades to be compatible with the SoS
architecture. The phasing and organization of the capabilities must be agreed to by both the systems and
the SoS manager, with performance, funding and schedules. The time phasing of capabilities
development is shown in the Capability Phasing Viewpoint (CV-3). If some systems’ capabilities were to
be ready before others and they could be used together, the timeframes would be noted and this would
become a Capability Vision Viewpoint (CV-1) that shows how the deployed capability is built up
(ASN/RDA, 2009). Mapping capability development to operational activities is shown with the
Capability to Operational Activities Mapping Viewpoint (CV-6). If some activities are not possible
without the developing capabilities, then there will be some operational changes over time, as well. Some
functions may be logically grouped because they can be reused to support other missions; some might be
grouped because they are unique to the SoS mission and configuration. Training and tactics may have to
be developed to use new capabilities, or even to get the systems to work together operationally if the
systems don’t already do so in existing, joint missions. These constraints may be shown in several of the
capability views, but especially in the Capability Dependencies (CV-4) and Capability to Organizational
Development (CV-5) viewpoints.
The decomposition of capabilities to functions, and the aggregation of functions to higher levels of
abstraction, eventually to capabilities, are inverse processes. Sometimes it is easier to decompose
downward, other times it is easier to aggregate upward. This depends on what information is available
when one starts the process. The important point is to fill in the Capability Taxonomy Viewpoint (CV-2),
so that it is complete and makes sense to all stakeholders (or at least is accepted by all) as the operative
definition for the SoS. The capability taxonomy is a subset of the Integrated Dictionary definitions, with
the addition of the item’s location within the hierarchy. Naturally, it is best to think through the
implications of the definitions for the whole lifecycle of the SoS. This also implies that the vision should
be sufficient to sustain a lifecycle view for the SoS, not merely the initial use of it. In practice, this
sufficiency of vision is rare.
Many SoS, in spite of being complicated arrangements, are also started as quick reaction responses to
environmental changes. Therefore, many SoS are short time frame exercises. “Make the required
changes quickly, and get them deployed!” is the prevailing attitude in this case. In this extreme, there is
scant thought given to planned upgrades, phased deployment, or building for growth. Here, all the
changes or new interfaces need to be developed in one time period (usually a budget period, called an
epoch in FILA-SoS). When there is planning for development over several epochs, the ‘in-work’
interfaces are regarded as not participating until their delivery epoch.
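The epoch rule for 'in-work' interfaces can be sketched directly; the function and field names below are assumptions, not the FILA-SoS implementation.

```python
def active_interfaces(planned, delivery_epoch, current_epoch):
    """Treat a planned interface as participating only once its delivery
    epoch has arrived; earlier epochs see a 0 in that position."""
    size = len(planned)
    return [[planned[i][j] if delivery_epoch[i][j] <= current_epoch else 0
             for j in range(size)] for i in range(size)]

planned = [[1, 1],
           [1, 1]]
delivery = [[0, 2],   # the cross interfaces are 'in-work' until epoch 2
            [2, 0]]
now = active_interfaces(planned, delivery, current_epoch=1)
# In epoch 1 only the systems themselves participate; the cross interfaces
# join the meta-architecture once epoch 2 is reached.
```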
The development of the Architecture Views must be an ongoing, continually updated process extending
throughout the program life cycle. Everything from the simplest cocktail napkin draft to the most detailed,
database driven, multiply approved, fully vetted graphical interface control definition should be documented
within a “model” as the single source of data. Many of the remaining DoDAF viewpoints can be derived
from basic Operational Activity Model Viewpoints (OV-5b) activity diagrams if they contain both
activities allocated to swim lanes (denoting the various participants/elements/actors) and sequenced data
flows between elements. This is an addition to the basic (minimalist) definition of an activity diagram,
but adding these two items is an important step in defining how the SoS will operate. One can vary the
amount of backup text residing in each object in the model. This depends on the amount of detail
required and available at each stage of the architecture development. However, the AV-2 works most
brilliantly if two conditions are fulfilled: all participants assiduously define their terms in it, and a
facilitator continually edits its contents for clarity and consistency. Consistency is sustained if the rest of
the documentation uses the AV-2.
An architecture of the SoS will exist, whether or not it is defined, planned, or understood. It will be a far
more useful architecture (and a better SoS) if the architecture is developed intentionally, and well
documented. Keeping the documentation in a standard, organized framework, maintained
throughout the life of the SoS in a central repository, makes it useful to new hires, visitors, and the
engineers and managers attempting to upgrade, maintain, or use the SoS in the future. (Steps 4 and 6 of
DoDAF 2.02.)
The Integrated Dictionary (AV-2) is the authoritative source for definitions of all elements of the
architecture or program descriptions. All acronyms, terms of art, and important concepts must be defined
there, and the source of the definition is maintained to give context for understanding arcane, duplicative,
or cross program usages (frequent occurrences in SoS). Example architectural element definitions are
shown in Table 4.
Table 4 Example AV-2, Integrated Dictionary
Phrase | Acronym | Definition | Source
Computing Infrastructure Readiness | CIR | Provides the necessary computing infrastructure and related services to allow the DoD to operate according to net-centric principles. It ensures that adequate processing, storage, and related infrastructure services are in place to respond dynamically to computing needs and to balance loads across the infrastructure. | DoD IEA v2.0
Concept of Operations | | A clear and concise statement of the line of action chosen by a commander in order to accomplish his mission. | Std I/F UCS Nato STANAG 4586-3
Conceptual Data Model | DIV-1 | The required high-level data concepts and their relationships. | DoDAF 2.02
Condition | | The state of an environment or situation in which a Performer performs. | DoDAF 2.02
Confidentiality | | Assurance that information is not disclosed to unauthorized entities or processes. | DoD IEA v2.0
Configuration | | A characteristic of a system element, or project artifact, describing their maturity or performance. | INCOSE Sys Eng Hndbk v3.2.1
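Source-tracked dictionary entries like those in Table 4 might be held in a structure such as the following. The Python layout and the priority scheme are assumptions for illustration, with entries abridged from Table 4.

```python
from dataclasses import dataclass

@dataclass
class Definition:
    text: str
    source: str
    priority: int           # lower value = more authoritative for this SoS

# A term may carry several context-dependent definitions; the facilitator
# assigns per-source priorities so overloaded terms stay unambiguous.
av2 = {
    "Condition": [
        Definition("The state of an environment or situation in which a "
                   "Performer performs.", "DoDAF 2.02", 1),
    ],
    "Confidentiality": [
        Definition("Assurance that information is not disclosed to "
                   "unauthorized entities or processes.", "DoD IEA v2.0", 2),
    ],
}

def preferred(term):
    """Return the highest-priority (lowest number) definition for a term."""
    return min(av2[term], key=lambda d: d.priority)
```

Keeping every definition, with its source, rather than overwriting, preserves the context needed to disambiguate cross-program usages.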
4.2.2.4 Conducting Analyses of SoS Behavior

The SoS manager and development facilitator must at this point be doing some mental estimation of
where the required SoS capabilities could be obtained and for what cost. They must be designing
questions to elicit both responses and thought from the stakeholders about what could be of value in
building the SoS. The stakeholders extend both up and down the chain of responsibilities with the SoS
manager in the middle. Are there potential multiple sources for most required capabilities? Are there
new ways of putting pieces together in different ways to accomplish necessary tasks or functions? Do
new technologies allow for anything more easily than previously envisioned? If something did work
in a new way, how much better would it be? What functional relationships could be described to evaluate
the SoS? What ranges of values of performance would be outstanding, pretty good, acceptable, poor, or
awful? Answering these questions allows models to be built with which new designs of a SoS can
be evaluated. (Step 5 of DoDAF 2.02. Step 6 is documenting the viewpoints in a self-consistent model,
which is done during all the previous steps.)
4.2.3 Review of the Method Steps

Yet another way to look at this model development process is shown in Figure 12, using the binary
participation meta-architecture model as a starting point. A vision of the SoS, drawn out through facilitated
stakeholder discussions, produces a plethora of linguistic terms and definitions. Linguistic analysis of these
discussions may be used to distill the SoS attributes that are important to the stakeholders. Linguistic
analysis also may be used to establish ranges of values for the attributes that are considered excellent,
good, or very bad, as well as the strength of the stakeholders’ feelings about each of these ranges. The
modeler gets to play a role at this point by writing trial algorithms that work on the meta-architecture to
deliver an initial trial measure of each attribute. These measures should depend significantly on the meta-
architecture, because they are used to evaluate the goodness of the architecture of the SoS. Given this
information, establishing membership functions to fit the fuzzy evaluation measures is relatively easy.
Rules for combining attribute valuations are also developed from stakeholder interviews and discussions.
The rules are embodied in a Mamdani fuzzy inference system or fuzzy associative memory in the form
shown in section 4.4. The measures are used to improve the selection
of the SoS architecture within the genetic algorithm approach. The optimized architecture is then
proposed for implementation and negotiation between the component systems and SoS manager. The
negotiations require a reasonably good starting point to have any chance of success, and that is what this
research is designed to provide – the starting architecture for the agent based modeling part of the
problem. The system negotiations are the key to getting a realizable, implementable architecture for the
SoS, because the systems cannot be forced into the SoS in the case of an acknowledged SoS being
analyzed in this effort.
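The chromosome-selection loop described here can be sketched as a simple mutation-and-selection loop. This is not the report's full genetic algorithm: the fitness stub stands in for the Mamdani fuzzy evaluation, crossover is omitted, and the parameter values are assumptions.

```python
import random

random.seed(1)
GENES, POP, GENS, MUT = 12, 20, 30, 0.05   # length, population, generations, mutation rate

def fitness(chrom):
    # Stand-in for the fuzzy SoS evaluation: reward capability, charge for cost.
    # First half of the genes are systems; second half are interfaces (assumed split).
    capability = sum(chrom[:6]) + 0.5 * sum(chrom[6:])
    cost = 0.3 * sum(chrom)
    return capability - cost

def mutate(chrom):
    return [1 - g if random.random() < MUT else g for g in chrom]

pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 2]                              # keep the better half
    pop = elite + [mutate(random.choice(elite)) for _ in range(POP - len(elite))]

best = max(pop, key=fitness)                             # proposed starting architecture
```

The `best` chromosome plays the role of the reasonably good starting point handed to the negotiation models.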
Figure 12. Given an Evaluation Model that depends on a chromosome from the meta-architecture, the genetic algorithm can optimize within a multidimensional space of attributes
It is important to devise a method to visualize how the component systems’ capabilities (ci) of various
architecture instantiations come together to create the SoS capabilities (C). This helps during the level
setting exercises, but is vital to describing the approach and the results to stakeholders as well.
Finally, there must be clear explanations of the limitations of the modeling approach. The numerous
simplifications mean that the model is not likely to match reality very well in detail; the best this model
can do is match in terms of broad, general trends in comparing high level architectural impacts between
different approaches to constructing the SoS.
For every SoS there will be requirements for component system and capability descriptions. Capabilities
of each system are denoted by ci, and the way those capabilities are combined into the SoS capability C
must be described and captured in the model. When the information is gathered and organized, then the
domain specific model is described, but the fact that there must be some way to build up the required
capability as a module in the model is domain independent. For our meta-architecture, there may be sets
of capabilities from each system, a combination algorithm to describe how the SoS capability is built up
from the systems, and costs for development of either new capabilities or interfaces, with schedules and
costs for operations of the systems in the SoS. More detailed models could be used, for example if the
cost of discovering and codifying new doctrine or tactics, and training in the new configurations is known
or can be estimated. Other desirable attributes and ways of measuring and combining them may be
discovered during the stakeholder discussions; these may be added to the SoS evaluation, but there must
be some process such as this regardless of domain. Initial draft runs of the model may also lead the
modelers to changes and improvements in the model modules. This method of discovering the domain-dependent
data that goes into the model is itself domain independent: it should work for any domain.

[Figure panel summary: Binary Meta-Architecture: systems and their interfaces are present (1) or absent (0); an instance of the meta-architecture is a "chromosome" representing one particular architecture. Stakeholder Discussions: facilitated interviews draw out input data and value judgments from key stakeholders; model building and validation iterations proceed toward consensus. Evaluation Model: fuzzy SoS attributes are created from stakeholder concerns, performance algorithms of collaborating systems, and advantages from interfacing; the fuzzy model can evaluate multiple attributes for each SoS chromosome to arrive at an overall SoS "fitness". Genetic Algorithm: can explore large volumes of the potential architecture space and can optimize with respect to many attributes using the overall fuzzy fitness.]
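The capability roll-up described above can be sketched as a small module. The union-based combination algorithm and the capability names below are illustrative assumptions; a real model would substitute the domain's own combination algorithm for building C from the ci.

```python
# Sketch of a capability roll-up: combine component-system capabilities
# c_i into the SoS capability C.  The union-based combination and the
# example capability names are illustrative assumptions only.

def sos_capability(selected_systems, system_capabilities):
    """Combine the capabilities of the participating systems (union)."""
    combined = set()
    for s in selected_systems:
        combined |= system_capabilities[s]
    return combined

def meets_requirement(selected_systems, system_capabilities, required):
    """True if the combined SoS capability C covers every required capability."""
    return required <= sos_capability(selected_systems, system_capabilities)

# Hypothetical systems with overlapping capability sets.
caps = {"A": {"sense"}, "B": {"sense", "relay"}, "C": {"strike"}}
required = {"sense", "relay", "strike"}
```

A richer model would attach development and operations costs to each capability and interface, as described above.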
4.2.3.1 Choosing the SoS Key Attributes

The facilitators and model builders need some basic knowledge of the domain of the SoS. Without that,
they will not be able to ask intelligent questions of the stakeholders and SMEs. Conversely, if the
facilitators know too much about a domain, they may unconsciously pre-select a solution, biasing the way
they ask questions. An example questionnaire form for directing the interviews with stakeholders
provides only a very high-level starting point; it should be adjusted for any specific SoS application.
Such questions are intended to elicit from the stakeholders
the key attributes, or key performance parameters (KPPs), that they care about for the development and
use of the SoS. The questions are asked from the point of view, and with the intention, of developing
relatively simple evaluation algorithms that depend strongly on the participating systems and their
interfaces and how they interconnect in operation. At the initial stage of development, these algorithms
may be fairly approximate, using rough estimates. The goal is to have some kind of broadly feasible
architecture with which to begin the analysis leading to negotiations between SoS manager and individual
systems for an agreed-to SoS. It is fully expected that individual systems’ performance, cost, schedule
and other attributes would be adjusted during negotiations. On the other hand, attribute evaluation
algorithms are designed to be modular, so that if better models become available, they may be substituted
in at any time. Well documented, traceable information trails are invaluable when reviewing, improving,
correcting, or extending the models (or the SoS itself). These documented traces are well described by
DoDAF style views.
4.2.3.2 Domain Model Data

There is a great deal of information potentially available about the components of the SoS; each system
may be complex in its own right. The architect must find a better way to share SoS data with analysts and
stakeholders than dozens or hundreds of columns of numbers. Color coded and textured graphs, multiple
and rotatable viewpoints into multi-dimensional data, slices, smart filtering, correlations, time series
analyses, and animations may all be used to aid the understanding of the very large sets of data produced
by SoS modeling (Yi, Kang, Stasko, & Jacko, 2007). Both large scale trends and significant but tiny
artifacts in the data must be easily and quickly discoverable in the way the data is conveyed to reviewers.
One needs to be careful of the color palette chosen to display results, since different display or printer
devices may represent them differently, sometimes in surprising ways. Some in the audience will usually
be color blind to various degrees, as well. One must be careful not to assume that color coding data
artifacts makes them obvious to others. (For those interested, the website
http://www.vischeck.com/examples/ simulates for people with normal color vision the way colorblind
people see, and suggests alterations to color palettes that allow more people to see an image in a similar
way.)
4.2.4 Architecture Space Exploration

This modeling method uses a genetic algorithm (GA) approach to explore the architecture space. A
population of chromosomes is evaluated and sorted to select the better ones for propagation to future
generations with genetic modifications. The ones and zeroes in the chromosomes are generated at
random during the first generation. This is normal GA procedure. However, to avoid getting an average
50% ones in the entire initial generation of chromosomes, a bias is applied to the random number
generator so that the probability of any bit in a chromosome of that generation being a one depends on the
chromosome’s position in the population. The number of chromosomes in a population for one
generation of the GA is variable. Typically a few tens to a few hundred chromosomes are used in the
population in each generation. For the initial generation, chromosome number 1 has only a few ones,
with mostly zeroes. The last chromosome in a population has mostly ones with only a few zeroes.
Typically, low numbers of ones in the chromosome (meaning fewer participating systems and/or interfaces) are
associated with lower cost and lower performance. The other attributes could be better or worse,
depending on their definitions. Higher numbers of participating systems and interfaces are usually
associated with higher performance and higher cost (equations 24 and 25). Costs/affordability and overall performance are almost universally necessary SoS evaluation
attributes. Since there is normally a desire for higher performance and lower cost, one hopes for a sweet
spot between the extremes, where one can get adequate performance and adequate affordability (nearly
the inverse of cost), as well as acceptable values of the other attributes.
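The biased initialization described above can be sketched as follows; the population size, chromosome length, and linear bias schedule are illustrative assumptions.

```python
import random

def biased_initial_population(pop_size, chrom_len, seed=None):
    """Generate a first GA generation in which the probability of any bit
    being a one depends on the chromosome's position in the population:
    chromosome 0 is mostly zeroes, the last chromosome is mostly ones."""
    rng = random.Random(seed)
    population = []
    for i in range(pop_size):
        p_one = (i + 1) / (pop_size + 1)   # bias rises with position
        population.append([1 if rng.random() < p_one else 0
                           for _ in range(chrom_len)])
    return population

# 25 chromosomes of 22 systems/interfaces, as in the Example 1 exploration.
pop = biased_initial_population(25, 22, seed=1)
```

This sweep of the fraction of ones is what lets the exploration phase cover sparse through dense architectures in a single population.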
A key feature of the method is to do an exploration of the architecture space with a few thousand sample
chromosomes, which cover a large range of participating systems and interfaces. Each of the
chromosome’s attribute evaluations is plotted against the membership functions for that attribute. The
membership function shapes and/or the algorithms for evaluating the attributes may need to be adjusted
several times in an iterative process that may include discussions with stakeholders to arrive at an acceptable
SoS model. This is explained further in section 4.6.
The meta-architecture and associated data model being proposed so far contains many features that mimic
real-life:
There may be multiple copies of the same system
There may be slight differences between the otherwise similar systems
Each system may have multiple capabilities
There is a minimum number of component capabilities required to make up the SoS capability.
If a proposed architecture does not have the minimum capabilities, a penalty is tacked on to its
performance, to enhance the chances of discarding its chromosome in the fitness comparisons at each
generation of the genetic algorithm. No population member or bit position is pre-selected to be discarded
before evaluating it for all attributes.
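A minimal sketch of this penalty step follows; the multiplicative penalty factor is an illustrative assumption, since the report states only that some penalty is applied to the performance of architectures missing the minimum capabilities.

```python
def penalized_performance(raw_performance, capabilities_present,
                          required_capabilities, penalty=0.5):
    """Apply a performance penalty when a proposed architecture lacks any
    of the minimum required SoS capabilities, lowering its fitness so its
    chromosome is more likely to be discarded each GA generation.
    The 0.5 multiplicative penalty is an illustrative assumption."""
    missing = required_capabilities - capabilities_present
    if missing:
        return raw_performance * penalty
    return raw_performance
```

Note that the chromosome is still evaluated for all attributes; the penalty only reduces its fitness in the comparisons, matching the statement above that nothing is pre-selected for discard.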
4.3 Individual Systems’ Information
4.3.1 Cost, Performance and Schedule Inputs of Component Systems

The models used here treat the cost of developing a new capability or adding an interface separately from
the cost of operating the system during a deployment of the SoS. In real life these costs are potentially
paid for from different pots of money, such as acquisition vs. operations budget lines in DoD, or current
vs. future funds. The development cost is normally a one-time cost, while the operations cost of the
system is a continuing cost each time the SoS is called into action. Performance enhancement is normally
on the order of a few percent for the modification requested to fit into the SoS. Adding an interface
might require installing a new radio and antennas on a vehicle, or extending the software database of
messages that can be handled by an existing system on a vehicle. There can be significant costs for a
minor modification to accomplish retesting of functions that might be affected by any changes to fielded
systems (called regression testing), in addition to testing of the change itself. Whatever the change, in
addition to time to develop and test the change, the system hosting it will be ‘down for maintenance’
during the installation of the change. The time to develop, install and test a change is the development
time. This is generally one epoch, or time period, in the wave model. If a capability already exists, such
as the ability to use a specific radio on a platform, the development cost and time for that system for that
capability will be zero. However, development of the other end of the interface on a different system may
still be required and will count toward the cost of the interface. Some complex modifications might take
two or more epochs to develop. In this case, since the development is not complete at the end of the first
epoch, it is as if the system chose not to participate, because it delivers no capability yet. However, one is
still spending funds on that development, for which one receives nothing until the development is
complete. A simplified template can be used to gather the
estimated individual system cost and performance data.
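The epoch-by-epoch cost structure just described can be sketched as follows. The even spreading of development cost across epochs and all numeric values are illustrative assumptions.

```python
def epoch_costs_and_delivery(dev_cost, dev_epochs, ops_cost_per_epoch, horizon):
    """Return a (cost, delivers_capability) pair per epoch of the wave model.
    Development funds are spent during dev_epochs but no capability is
    delivered until development completes; after that, operations cost is
    paid each epoch the system participates.  Spreading dev_cost evenly
    across the development epochs is an illustrative assumption."""
    schedule = []
    per_epoch_dev = dev_cost / dev_epochs if dev_epochs else 0.0
    for epoch in range(horizon):
        if epoch < dev_epochs:
            schedule.append((per_epoch_dev, False))  # paying, nothing delivered yet
        else:
            schedule.append((ops_cost_per_epoch, True))
    return schedule

# Hypothetical system: $6M development over 2 epochs, $1.5M operations per epoch.
plan = epoch_costs_and_delivery(dev_cost=6.0, dev_epochs=2,
                                ops_cost_per_epoch=1.5, horizon=4)
```

With dev_epochs set to zero (a capability that already exists), the sketch returns zero development cost and immediate delivery, matching the text above.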
4.3.2 Membership Functions

Membership functions (MF) map the fuzzy values to the real world values and show the fuzziness of the
boundaries between the granularities or grades within each attribute. The Matlab Fuzzy Toolbox has a
number of built in shapes for membership functions. Triangular, trapezoidal, and the Gaussian smoothed
corners of trapezoidal shapes are available among others; only the Gaussian rounded trapezoidal shape
shown in Figure 13 was used in this analysis. It is very common when evaluating large projects to have a
band of acceptability for each grade in each attribute. A familiar example is the Contractor Performance
Assessment Reporting System (CPARS). It assigns one of five colors to a series of common measures of
project status. It is also common to have multiple reviewers provide a grade in each area, which is then
averaged to get the final grade (Department of the Navy, 1997). This is intended to avoid the issue of
unconscious bias or error of interpretation of the data by a single reviewer on a borderline issue. This
process is very similar to the mathematically more precise fuzzy logic process. Other MF shapes show
similar characteristics, but the nonlinearities in the output surface display the concepts better with the
slightly rounded MF shapes shown in Figure 13. All the variables in the Fuzzy Toolbox are scaled the
same way; in real space, a further scaling is required for the individual variables. The MFs cross each
other at the 50% level between each of the numbers in the granularity scale from 1 to 4. For this fuzzy
inference system (FIS), 1 = Unacceptable, 2 = Marginal, 3= Acceptable, and 4 = Exceeds (expectations)
for each attribute: Performance, Affordability, (Developmental) Flexibility, and Robustness. There is no
requirement that the scaling be the same for different attributes. In fact, the Matlab Fuzzy Toolbox
allows the MFs to be scaled to real values, and it might have been clearer to use that facility, but the
graphical user interface (GUI) for changing the values is rather tedious, so a method to apply the scaling
outside the GUI was developed. The process of translating real values to fuzzy values is called
fuzzification or fuzzifying. Multiple criteria are combined through the rules in fuzzy space, and the
output fuzzy value is de-fuzzified to a crisp value for the SoS assessment. In this fuzzification scheme,
values are rounded to the nearest integer value for each fuzzy gradation. In the example in Figure 13, a
fuzzy value of 2.35 would round to 2, and not coincidentally, would fall on the sloping line for Marginal
membership at a value of about 65%, higher than on the line for Acceptable membership of about 35%.
The next section discusses how the real values are mapped to the fuzzy scale.
Figure 13. Matlab Fuzzy Toolbox printout of membership function shapes used in this analysis
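An illustrative sketch of such a Gaussian-rounded trapezoidal membership function follows. The flat-top half-width and sigma are assumed values, chosen here so that neighboring grades cross at the 50% level midway between the integer grades as described above; the report's actual shapes (and hence the exact 65%/35% figures quoted for the value 2.35) come from the Matlab toolbox and may differ.

```python
import math

SHOULDER = 0.2                               # half-width of the flat top (assumed)
SIGMA = 0.3 / math.sqrt(2 * math.log(2))     # gives a 50% crossing midway

def membership(grade, x):
    """Gaussian-rounded trapezoidal membership for a grade centered on an
    integer of the 1..4 fuzzy scale (1=Unacceptable ... 4=Exceeds)."""
    lo, hi = grade - SHOULDER, grade + SHOULDER
    if lo <= x <= hi:
        return 1.0                           # flat top of the trapezoid
    d = (lo - x) if x < lo else (x - hi)     # distance past the shoulder
    return math.exp(-d * d / (2 * SIGMA * SIGMA))
```

With these parameters a value of 2.35 has a higher Marginal (grade 2) membership than Acceptable (grade 3) membership, and the two grades cross at 50% at exactly 2.5.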
4.3.3 Mapping Attribute Measures to Fuzzy Variables

The generic membership function range is the ‘universe of discourse.’ This typical range, shown in
section 4.3.2, must be mapped to the real world values of the domain specific SoS. The mapping can be
done inside Matlab so that Figure 13 would be scaled in real units, but that requires working in a tedious
GUI. It can also be done by mapping the key shape points of the scaled MF to the real world values. In a
real problem, this mapping of ranges for each attribute would come from the problem definition and the
stakeholders’ beliefs and desires discovered during the model building step of the method. Examples
could be estimated values for cost of the SoS, or performance in terms of square miles searched per hour,
or tons of freight delivered per day in another type of problem. The probability of success, or the number
of shipments, or other attributes would have desired thresholds that define the levels of performance in
each attribute, such as: unacceptable, marginal, acceptable, or exceeds expectations in a four part
granularity for each attribute. The judging criteria may take on a wide variety of terminology and of
forms, depending on the domain. Any degree of granularity is possible. An even number of gradations
was chosen in this instance to avoid the possibility of an evaluation question being answered in the
middle. Odd numbers of gradations tend to allow stakeholders to answer too many valuation questions
disproportionately in the middle during the interview process, while even numbers of gradations force the
choices to be above or below average. This depends on one’s problem and particular stakeholders, of
course.
Figure 14 shows a typical mapping between real world values on the left, and the fuzzy variable on the
bottom. Note that there is no requirement for the mappings to be linear. Figure 15 shows affordability
and robustness mapped to their fuzzy values. All the attribute values need to be a matched set, for a
matched set of attribute models. In this case, robustness depends on the range of the values of
performance. Therefore, if the maximum performance doubles due to a change in the model, then the real
world robustness map would need to change as well. The real world values for affordability are dollars,
and robustness is the maximum loss of performance when removing any system, but they are mapped as
negative values here, because less is better. This allows the fuzzy attributes to be plotted as
monotonically increasing. Minor kinks in the mapping lines show that the slopes of the membership
function maps do not need to be constant.
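Such a piecewise-linear mapping, with slopes that may change between breakpoints, can be sketched as below. The affordability breakpoints are approximate values read from the figure and should be treated as illustrative assumptions.

```python
# Sketch: map a fuzzy-scale value to a real-world value through a
# piecewise-linear curve with kinks (non-constant slope).  Affordability
# breakpoints (negative $M, so more is better) are approximate values
# read from the figure, used here purely for illustration.

FUZZY_PTS = [1.0, 1.5, 2.5, 3.5, 4.0]
AFFORDABILITY_PTS = [-250.0, -190.0, -155.0, -110.0, -60.0]

def fuzzy_to_real(x, fuzzy_pts, real_pts):
    """Linear interpolation between breakpoints; clamps at the ends."""
    if x <= fuzzy_pts[0]:
        return real_pts[0]
    for (x0, y0), (x1, y1) in zip(zip(fuzzy_pts, real_pts),
                                  zip(fuzzy_pts[1:], real_pts[1:])):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return real_pts[-1]
```

The map is monotonically increasing, as required for the fuzzy attributes, and the same function serves any attribute given its own breakpoint lists.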
Figure 14. Map from the fuzzy variable on the horizontal axis (1 = Unacceptable to 4 = Exceeds) to the real-world performance measure, likelihood of detection per day, on the vertical axis
Figure 15. Attribute values mapped to fuzzy variables (affordability panel with breakpoints near -250, -190, -155, -110, -60; robustness panel with breakpoints near -0.75, -0.45, -0.3, -0.15, 0)
4.3.4 Non-Linear Trades in Multiple Objectives of SoS

Fuzzy logic can be used to fit highly nonlinear surfaces even with a relatively small rule base. The
commonly cited problem of dimensionality for fuzzy logic systems in fitting arbitrarily large input sets
(Gegov, 2010) does not arise in this problem because the number of inputs is small, limited to the
KPAs of the SoS design problem. The combination of membership function shapes and combining rules
allows one to fit quite nonlinear surfaces in the several required dimensions of this problem.
Furthermore, the input variables are generally monotonic, increasing in value from the fuzzy value of
‘worst’ to ‘best.’ All the membership functions used in this effort (input and output) have been scaled
from 1 to 4 for simplicity of display in this document, but that scaling is purely arbitrary. The actual
scaling is through linguistic variables discovered through the interactions of the facilitator, SMEs, and
stakeholders. They are typically terms such as “very bad,” “good,” “excellent,” etc. For most attributes,
there is a further mapping of the linguistic terms, such as ‘excellent affordability is a cost between $8M
and $10M,’ or ‘acceptable affordability is cost between $10M and $12M.’ If the attribute evaluation
elements can be categorized in such fuzzy terms as this, then relatively simple rules for combining them
can result in a straightforward overall SoS evaluation from the resultant fuzzy inference system or fuzzy
rule based system.
Figure 16. Example of nonlinear SoS fitness versus Affordability and Performance, based on membership function shapes and combining rules
4.4 Combining SoS Attribute Values Into an Overall SoS Measure

A Mamdani fuzzy inference system allows the combination of as many input attributes as desired (Fogel,
2006). Each attribute is equivalent to an objective or dimension in a multi-objective optimization
problem. Gegov expanded this concept to include networks of fuzzy systems, to cover deep and
complicated problems with many dimensions (Gegov, 2010), and uncertainties extending to Type II fuzzy
sets. Nevertheless, if rules of the form discussed below (which are symmetrical) are combined with rules
of the form ‘if attribute one and attribute four are excellent, but attribute five is marginal, then the SoS is
better-than-average,’ etc., which allow for asymmetry or non-uniform weighting among attributes, then
very complex evaluation criteria may be described for the SoS. Using membership function shapes other
than those shown in Figure 13 also allows considerable tuning of the mapping of input attribute values
(depending on the SoS architecture or chromosome structure in the model) to the output of the overall
SoS quality or fitness.
A Mamdani Type I fuzzy rule set, also called a Fuzzy Associative Memory (FAM), combines the
attribute values into the overall SoS fitness score. Attribute measures are converted to fuzzy variables
through the mappings explained in section 4.3.3, and the rules are followed to form a fuzzy measure for the SoS
architecture (represented by a chromosome). That measure may be de-fuzzified back to a crisp value for
final comparison in the GA through an equivalent mapping in the output space. The rules should be kept
simple for two reasons: primarily it is easier for the analyst to understand and to explain them to the
stakeholders, but also because a few rules within the fuzzy logic system can be very powerful in defining
the shape of the resulting surface. Still, some sensitivity analysis can be done on the rule sets, and results
of minor changes in the rules may be displayed for comparison, all other things being kept the same.
Rules are typically of the form: ‘if all attributes are good, then the SoS is superb,’ ‘if all attributes except
one are superb, then the SoS is still superb,’ ‘if any attribute is completely unacceptable, then the SoS is
unacceptable.’ A dozen or so of these rules can give an excellent estimate of the stakeholders’ intentions,
including significant nonlinearities and complexity (Gegov, 2010). The Mamdani FIS allows satisfying
two contradictory rules simultaneously by simply including them both in the calculation of the resultant
output value.
The linguistic form of some of these rules may be easier to express than the mathematical form. For
example, ‘if any attribute is unacceptable, then the SoS is unacceptable’ can be expressed linguistically as
a single sentence, but mathematically a separate rule for each attribute must be tested alone to imply the
unacceptability of the SoS. If the rule can be expressed as a single sentence linguistically, as in
Table 5, it will be counted as only one rule. The rules come out of linguistic analysis
of the stakeholder interviews, with some normative smoothing by the facilitator. At worst, if consensus
cannot be reached on a rule statement among the stakeholders, a version of the analysis with the rule
expressed both ways can be compared for sensitivity to that rule. This approach can also help explain the
issue to the stakeholders.
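Rules of the form just described can be sketched with a simplified Mamdani-style evaluation. Here min() implements AND across attributes, max() implements ANY, and de-fuzzification is a weighted average of singleton grade outputs, a simplification of the full centroid de-fuzzification performed by the Matlab toolbox; the three rules and the grade values are illustrative.

```python
# Simplified Mamdani-style combination of fuzzified attribute grades into
# an SoS fitness.  The singleton weighted-average defuzzification is a
# simplification of the toolbox's centroid method.

GRADES = {"Unacceptable": 1.0, "Marginal": 2.0, "Acceptable": 3.0, "Exceeds": 4.0}

def sos_fitness(mu):
    """mu maps each attribute name to its membership degree in each grade."""
    any_unacceptable = max(m["Unacceptable"] for m in mu.values())  # ANY = max
    all_marginal = min(m["Marginal"] for m in mu.values())          # ALL = min
    all_acceptable = min(m["Acceptable"] for m in mu.values())
    rules = [(any_unacceptable, "Unacceptable"),  # any attribute Unacceptable
             (all_marginal, "Unacceptable"),      # all attributes Marginal
             (all_acceptable, "Exceeds")]         # all attributes Acceptable
    den = sum(s for s, _ in rules)
    if den == 0.0:
        return 0.0
    return sum(s * GRADES[out] for s, out in rules) / den
```

Contradictory rules are handled exactly as described above: both simply contribute, weighted by their firing strengths, to the resultant output value.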
Table 5. Example of a few powerful fuzzy inference rules for combining attribute values

Five plain-language rules:
1. If ANY single attribute is Unacceptable, then the SoS is Unacceptable.
2. If ALL of the attributes are Marginal, then the SoS is Unacceptable.
3. If ALL the attributes are Acceptable, then the SoS is Exceeds.
4. If (Performance AND Affordability) are Exceeds, but (Dev. Flexibility AND Robustness) are Marginal, then the SoS is Acceptable.
5. If ALL attributes EXCEPT ONE are Marginal, then the SoS is still Marginal.
Rules for ISR:
1. If all attributes are Unacceptable, then the SoS is Unacceptable.
2. If all attributes are Marginal, then the SoS is Unacceptable.
3. If all attributes are Acceptable, then the SoS is Exceeds.
4. If Performance and Affordability are Exceeds, and Flexibility and Robustness are NOT Unacceptable, then the SoS is Acceptable.
5. If all attributes except Robustness are Marginal, and Robustness is NOT Exceeds, then the SoS is Marginal.
6. If all attributes except Flexibility are Marginal, and Flexibility is NOT Exceeds, then the SoS is Marginal.
7. If all attributes except Affordability are Marginal, and Affordability is NOT Unacceptable, then the SoS is Acceptable.

The resulting fitness surfaces follow from these rules, with the Performance vs. Robustness surface looking the same as the one shown for Performance vs. Flexibility.
Rules for SAR:
1. If all attributes are Unacceptable, then the SoS is Unacceptable.
2. If all attributes are Marginal, then the SoS is Unacceptable.
3. If all attributes are Acceptable, then the SoS is Exceeds.
4. If Performance and Affordability are Exceeds, and Flexibility and Robustness are Marginal, then the SoS is Acceptable.
5. If all attributes except Robustness are Marginal, and Robustness is Acceptable, then the SoS is Marginal.
6. If all attributes except Flexibility are Marginal, and Flexibility is NOT Exceeds, then the SoS is Unacceptable.
7. If all attributes except Affordability are Marginal, and Affordability is NOT Unacceptable, then the SoS is Acceptable.
The surfaces for Affordability and Performance look the same as in the previous rule set, but Flexibility
and Robustness both change shape. Flexibility depended on the distribution of capabilities among systems
(set by the input domain data sheet, which varied between domains). Robustness was the best remaining
performance after removing each system in turn; the evaluation simply reuses the performance function
repeatedly, removing each system successively. Affordability also depended on the input domain data,
adjusted independently for each domain.
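The leave-one-out robustness evaluation just described can be sketched as follows; the per-system scores and the additive performance function are hypothetical stand-ins for the domain performance model.

```python
def robustness(selected, performance):
    """Leave-one-out robustness: the largest loss in SoS performance when
    any single participating system is removed, returned as a negative
    number (less loss is better), matching the negative mapping used for
    the fuzzy robustness attribute."""
    base = performance(selected)
    worst_loss = 0.0
    for s in selected:
        remaining = [x for x in selected if x != s]
        worst_loss = max(worst_loss, base - performance(remaining))
    return -worst_loss

# Hypothetical performance function: sum of per-system scores.
scores = {"A": 4, "B": 3, "C": 1}

def perf(systems):
    return sum(scores[s] for s in systems)
```

Because the same performance function is simply re-evaluated once per removed system, the cost of this attribute grows linearly with the number of participating systems.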
4.5 Exploring the SoS Architecture Space with Genetic Algorithms

Having developed a method of evaluating architectures based on presence or absence of any combination
of systems and interfaces within the meta-architecture, this evaluation may be used as the fitness measure
for selection for propagation to a new generation within an evolutionary algorithm. One class of
evolutionary algorithm is the genetic algorithm (GA). The key feature of a GA approach is to evaluate
the overall fitness of a series of chromosomes in a ‘population.’ One then sorts the chromosomes by their
fitness, and proceeds to the next generation through mutations, crossovers, or ‘sexual reproduction’ of a
fraction of the fitter chromosomes in that generation. Mutation rates, crossover points, special
rules for certain sections of the chromosome (genes), or deciding which parents are combined, can all be
varied as part of the GA approach.
The GA first generation starts with a population of random arrangements of chromosomes built from the
meta-architecture, which spans the search space, then sorts them by fitness. A fraction of the better
performing chromosomes is selected for propagation to the next generation through mutation and/or
transposition. A few poorly performing chromosomes may also be included for the next generation, to
avoid the danger of becoming stuck on a purely local optimum, although proper selection of mutation and
transposition processes can also help avoid this problem.
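One generation of this select-and-propagate loop can be sketched as below. The elite fraction, the number of poorly performing 'lucky' survivors, and the mutation rate are all illustrative assumptions.

```python
import random

def next_generation(population, fitness, rng, elite_frac=0.4,
                    lucky=2, mutation_rate=0.05):
    """One GA generation: sort by fitness, keep the better fraction plus a
    few poorly performing chromosomes (to avoid sticking on a purely local
    optimum), then refill the population with mutated copies."""
    ranked = sorted(population, key=fitness, reverse=True)
    elite_n = max(2, int(elite_frac * len(population)))
    parents = ranked[:elite_n] + rng.sample(ranked[elite_n:], lucky)
    children = list(parents)                  # elites survive unmutated
    while len(children) < len(population):
        parent = rng.choice(parents)
        child = [b ^ 1 if rng.random() < mutation_rate else b
                 for b in parent]             # bit-flip mutation
        children.append(child)
    return children
```

Because the elites are carried forward unmutated, the best fitness seen so far never decreases across generations; crossover and transposition operators could be added in the refill loop in the same style.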
4.6 Combining the Fuzzy Approach with the GA Approach

In order for the GA to work with any string of bits within the meta-architecture, the algorithms for
evaluating each attribute must work for any string of bits. The results of individual attribute evaluations
may take on a large range of values. When the desired and tradable values of the attributes, and the
algorithms for evaluating them, are determined from the SoS stakeholder interviews, the range of values
of each attribute is pre-determined. The entire range of values is the ‘universe of discourse.’ In each
dimension or attribute, the entire range is mapped contiguously to the granularity described by the
membership functions. There is no guarantee that any arrangement of systems and interfaces will be
found to be acceptable. Because this effort was to develop and explore the method, and the example SoS
were largely fictional, all the model parameters could be adjusted to find examples that would work. The
key to this adjusting process was to plot the attribute evaluations against the number of ones in the
chromosome.
Biasing the random number generator to produce a population of chromosomes with varying numbers of
ones allowed an exploration of chromosomes from various regions of the meta-architecture. By iterating
adjustments of the attribute membership function edges against a population of randomly generated (but
biased in the number of ones) initial populations of chromosomes, an acceptable picture of the SoS
behavior could be determined.
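The exploration phase can be sketched as below: evaluate an attribute over a biased population sweeping the fraction of ones, and use the observed range to place the membership-function edges. The attribute function, population size, and chromosome length are hypothetical stand-ins.

```python
import random

def explore_attribute_range(attribute, pop_size, chrom_len, seed=0):
    """Evaluate one attribute over a biased random population whose
    fraction of ones sweeps from low to high, and return the observed
    (min, max) -- a basis for placing the membership-function edges
    (the universe of discourse) for that attribute."""
    rng = random.Random(seed)
    values = []
    for i in range(pop_size):
        p_one = (i + 1) / (pop_size + 1)      # bias rises with position
        chrom = [1 if rng.random() < p_one else 0 for _ in range(chrom_len)]
        values.append(attribute(chrom))
    return min(values), max(values)

# Hypothetical attribute: cost rises with the number of participating bits.
lo, hi = explore_attribute_range(lambda c: 10.0 * sum(c), 200, 22)
```

Plotting these attribute values against the number of ones in each chromosome, as described above, then guides the iterative adjustment of the membership-function edges with the stakeholders.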
Figure 17. Exploring the meta-architecture - 25 chromosomes, 22 systems, Example 1
Figure 18. Exploring the meta-architecture to map membership function edges, Example 2
When a few hundred chromosomes are present in the exploratory population, one can get a very good
idea of the shape of the behavior of the meta-architecture space as a function of the number of interfaces
between systems of the SoS, shown in Figure 19. More systems and interfaces generally lead to more of
all the attributes (performance, flexibility, robustness), but to more cost as well (i.e., less affordability).
However, one can also see that the trends are noisy, and not perfectly correlated in the ISR model shown
in Figure 19. The exploration phase allows the setting of the membership function edges to take
advantage of the variability in the evaluation functions to drive the GA search toward regions that look
more likely to produce a decent compromise from among the competing attributes.
Figure 19. Exploring large populations to set the membership function edges
One needs to be in a reachable region of the SoS attribute space, or the universe of discourse, defined by
the membership function edges in the fuzzy inference system. It is of little value to have an architecture
that produces $100M solutions when the only acceptable value is less than $50M. Therefore, some level
setting of expectations, tuning of algorithms, and of the input domain data may all be necessary to reach a
reasonable “space” within which to attempt optimization with the GA. This is the function of the
exploration phase of the process.
5 Concluding Remarks

The research presented has been based on Type-1 fuzzy sets. There are certain benefits of using a Type-2
fuzzy logic system for our particular research problem. Type-2 fuzzy sets allow us to handle linguistic
uncertainties, as typified by the adage “words can mean different things to different people” (Karnik,
Mendel & Liang, 1999). The benefits are as follows. During collection of rules by
surveying experts, one is very likely to get diverse answers from each expert, based on the locations and
spreads of the fuzzy sets associated with antecedent and consequent terms. This leads to statistical
uncertainties about locations and spreads of antecedent and consequent fuzzy sets. Such uncertainties can
be incorporated into the descriptions of these sets using type-2 membership functions. In addition, experts
often give different answers to the same rule-question; this results in rules that have the same antecedents
but different consequents. In such a case, it is also possible to represent the output of the FLS built from
these rules as a fuzzy set rather than a crisp number. This can also be achieved within the type-2
framework. Finally, uncertainty during training also exists in both the antecedents and consequents. If
we have information about the level of uncertainty, it can be used when we model antecedents and
consequents as Type-2 sets. This is presented in Volume 12.
Cited and Related References

[email protected]. (2009, Jan 12). Joint Capability Areas. Washington DC.
Abo-Sinna, M. A., & Baky, I. A. (2007). Interactive balance space approach for solving multi-level multi-objective programming problems. Information Sciences, 177, 3397–3410.
Abraham, A., & Jain, L. (2005). Evolutionary Multiobjective Optimization. Evolutionary Multiobjective Optimization, 1-6.
Acheson, P. (2010). Methodology for Object Oriented System Architecture Development. IEEE Systems Conference.
Acheson, P., Dagli, C., & Kilicay-Ergin, N. (2013, March). Fuzzy Decision Analysis in Negotiation between the System of Systems Agent and the System Agent in an Agent Based Model. International Journal of Soft Computing and Software Engineering [JSCSE]. doi:10.7321/jscse
Acheson, P., Pape, L., Dagli, C., Kilcay-Ergin, N., Columbi, J., & Haris, K. (2012). Understanding System of Systems Development Using an Agent-Based Wave Model. Complex Adaptive Systems, Publication 2. 12, pp. 21-30. Washington D.C.: Procedia Computer Science.
Agarwal, S., & Dagli, C. H. (2013). Augmented Cognition in Human–System Interaction through Coupled Action of Body Sensor Network and Agent Based Modeling. Conference on System Engineering Research, (pp. 20-28).
Agarwal, S., Pape, L. E., & Dagli, C. H. (2014). GA and PSO Hybrid with Type-2 Fuzzy Sets for Generating Systems of Systems Architectures. Procedia Computer science.
Agarwal, S., Saferpour, H. R., & Dagli, C. H. (2014). Adaptive Learning Model for Predicting Negotiation Behaviors through Hybrid K-means Clustering, Linear Vector Quantization and 2-Tuple Fuzzy Linguistic Model. Procedia Computer Science, 36, 285-292.
Agarwal, S., Wang, R., & Dagli, C. H. (2014). Executable Architectures Using Cuckoo Search Optimization Coupled with OPM and CPN-A Module: A New Meta-Architecture Model for FILA SoS. In Complex Systems Design & Management (pp. 175-192). Paris: Springer International Publishing.
Ahn, J.-H. (2012). An Architecture Description Method for Acknowledged System of Systems based on Federated Architecture. Advanced Science and Technology Letters 05.
Alberts, D. S. (2011). The Agility Advantage: A Survival Guide for Complex Enterprises and Endeavors. Washington DC: Center for Advanced Concepts and Technology.
Alberts, D. S., Garstka, J. J., & Stein, F. P. (1999). Network Centric Warfare: Developing and Leveraging Information Superiority, 2nd Edition. Washington DC: C4ISR Cooperative Research Program.
Alfaris, A. (2009). The Evolutionary Design Model (EDM) for the Design of Complex Engineered Systems: Masdar City as a Case Study. Massachusetts Institute of Technology.
Alves, M. J., Dempe, S., & Júdice, J. J. (2012). Computing the pareto frontier of a bi-objective bi-level linear problem using a multiobjective mixed-integer programming algorithm. Optimization, 61(3), 335–358.
Amberg, M. (1996). Modeling Adaptive Workflows in Distributed Environments. Proc. of the 1st Int. Conf. on Practical Aspects of Knowledge Management. Basel.
Anandalingam, G., & Friesz, T. L. (1992). Hierarchical optimization: An introduction. Annals of Operations Research, 34(1), 1-11.
Arnold, A., Boyer, B., & Legay, A. (2012). Contracts and Behavioral Patterns for System of Systems: The EU IP DANSE Approach.
ASD(NII), D. (2009). DoD Architecture Framework Version 2.0 (DoDAF V2.0). Washington DC: Department of Defense.
ASD(NII), D. (2010). DoD Architecture Framework Version 2.02 (DoDAF V2.02). Washington DC: Department of Defense.
ASN/RDA. (2009). Net-Ready Key Performance Parameter (NR-KPP) Implementation Guidebook, Ver. 1. Washington DC: Department of the Navy.
Bac, M., & Raff, H. (1996). Issue-by-issue negotiations: the role of information and time preference. Games and Economic Behavior, 125-134.
Baky, I. A. (2009). Fuzzy goal programming algorithm for solving decentralized bi-level multi-objective programming problems. Fuzzy Sets and Systems, 160, 2701–2713.
Baky, I. A. (2010). Solving multi-level multi-objective linear programming problems through fuzzy goal programming approach. Applied Mathematical Modelling, 34, 2377–2387.
Bergey, J. K. (2009). US Army Workshop on Exploring Enterprise, System of Systems, System, and Software Architectures.
Blanchard , B. S., & Fabrycky , W. J. (2010). Systems Engineering and Analysis. Upper Saddle River, NJ: Prentice Hall.
Bonabeau, E. (2002). Agent-based Modeling: Methods and Techniques for Simulating Human Systems. Proceedings of the National Academy of Sciences, Vol. 99, 7280-7287.
Bouleimen, K., & Lecocq, H. (2003). A new efficient simulated annealing algorithm for the resource-constrained project scheduling problem and its multiple mode version. European Journal of Operational Research, 149(2), 268-281.
Bradley, S. P., Hax, A. C., & Magnant, T. L. (1977). Applied Mathematical Programming. Addison-Wesley.
Brucker, P., Drexl, A., Möhring, R., Neumann, K., & Pesch, E. (1999). Resource Constrained Project Scheduling: Notation, Classification, Models, and Methods. European Journal of Operational Research, 112(1), 2-41.
Cai, Z., & Wang, Y. (2012). Combining Multiobjective Optimization with Differential Evolution to Solve Constrained Optimization Problems. IEEE Transactions on Evolutionary Computation, 1(16), 117-134.
Cara, A. B., Wagner, C., Hagras, H., Pomares, H., & Rojas, I. (2013). Multiobjective Optimization and comparison of Nonsingleton Type-1 and Singleton Interval Type-2 Fuzzy Logic Systems. IEEE Transactions on Fuzzy Systems, 21(3), 459-476.
Chattopadhyay, D., Ross, A. M., & Rhodes, D. H. (2008). A framework for tradespace exploration of systems of systems. 6th Conference on Systems Engineering Research. Los Angeles CA.
Christian III, J. A. (2004). A Quantitative Approach to Assessing System Evolvability. Houston: NASA Johnson Space Center.
CJCSI 6212.01F. (12 Mar 2012). Net Ready Key Performance Parameter (NR KPP). Washington DC: US Dept of Defense.
Cloutier, R. J., Dimario, M. J., & Polzer, H. W. (2009). Net Centricity and System of Systems. In M. Jamshidi, System of Systems Engineering (pp. 150-168). Hoboken NJ: John Wiley & Sons.
Clune, J., Mouret, J.-B., & Lipson, H. (2013). The evolutionary origins of modularity. Proceedings of the Royal Society - B, 280, 2863.
Coello, C. A., & Lamont, G. B. (2004). An Introduction to Multi-Objective Evolutionary Algorithms and Their Applications. Applications of Multi-Objective Evolutionary Algorithms, 1, 1-28.
Coello, C. A., & Mezura-monte, E. (2005). A Simple Multimembered Evolution Strategy to Solve Constrained Optimization Problems. IEEE Transactions on Evolutionary Computation, 9(1), 1-17.
Cohon, J. L. (1985). Multicriteria Programming: Brief Review and Application. In G. J. S (Ed.). New York: Academic Press.
Coleman, J. W., Malmos, A. K., Larsen, P. g., Peleska, J., Hains, R., Andrews, Z., . . . Didier, A. (2012). COMPASS Tool Vision for a System of Systems Collaborative Development Environment. Proceedings of the 7th International Conference on System of System Engineering, IEEE SoSE 2012.
Contag, G., Laing, C., Pabon, J., Rosenberg, E., Tomasino, K., & Tonello, J. (2013). Nighthawk System Search and Rescue (SAR) Unmanned Vehicle (UV) System Development. SE4150 Design Project, Naval Postgraduate School.
Crossley, W. A., & Laananen, D. H. (1996). Conceptual Design of Helicopters via Genetic Algorithm. Journal of Aircraft, 33(November-December), 1062-1070.
Dagli, C. H., & Wang, R. (2013). Developing a holistic modeling approach for search-based system architecting. Procedia Computer Science, 16, 206-215.
Dagli, C. H., Singh, A., Dauby, J., & Wang, R. (2009). Smart Systems Architecting: Computational Intelligence Applied to Trade Space Exploration and System Design. Systems Research Forum, 3(2), 101–120.
Dagli, C., Ergin, N., Enke, D., Giammarco, K., Gosavi, A., Qin, R., . . . Pape, L. (2013). An Advanced Computational Approach to System of Systems Analysis & Architecting using Agent Based Behavioral Modeling. Systems Engineering Research Center. Hoboken NJ: SERC.
Dahmann, J. (2012). INCOSE SoS Working Group Pain Points. Proc TTCP-JSA-TP4 Meeting. Retrieved from http://www.ndia.org/Divisions/Divisions/SystemsEngineering/Documents/NDIA-SE-MS-SoS_2013-08-20_Dahmann.pdf
Dahmann, J. (2014). System of System Pain Points. INCOSE International Symposium. Las Vegas: INCOSE.
Dahmann, J., Baldwin, K. J., & Rebovich, G. (2009). Systems of Systems and Net-Centric Enterprise Systems. 7th Annual Conference on Systems Engineering Research. Loughborough.
Dahmann, J., Lane, J., Rebovich, G., & Baldwin, K. (2008). A model of systems engineering in a system of systems context. Proceedings of the Conference on Systems Engineering Research. Los Angeles CA.
Dahmann, J., Rebovich, G., Lane, J. A., Lowry, R., & Baldwin, K. (2011). An Implementers’ View of Systems Engineering for Systems of Systems. Proceedings of IEEE International Systems Conference. Montreal.
Dauby, J. P., & Dagli, C. H. (2011). The canonical decomposition fuzzy comparative methodology for assessing architectures. IEEE Systems Journal, vol. 5, no. 2, 244-255.
Dauby, J. P., & Upholzer, S. (2011). Exploring Behavioral Dynamics in Systems of Systems. In C. (. Dagli, Complex Adaptive Systems, Procedia Computer Science, vol 6 (pp. 34-39). Elsevier.
Deb, K. (2000). An efficient constraint handling method for genetic algorithms. Computer Methods Applied Mechanics and Engineering(186), 311-338.
Deb, K. (2001). Multi-objective Optimization using Evolutionary Algorithms. Chichester: John Wiley & Sons.
Deb, K., & Gupta, H. (2006). Introducing Robustness in Multi-Objective Optimization. Evolutionary Computation, 14(4), 463-494.
Deb, K., & Sinha, A. (2009). Solving bilevel multi-objective optimization problems using evolutionary algorithms. In M. F.-K. Ehrgott, Evolutionary Multi-Criterion Optimization. Lecture Notes in Computer Science (pp. 110–124). Berlin: Springer.
DeLaurentis, D., Marais, K., Davendralingam, N., Han, S. Y., Uday, P., Fang, Z., & Gurainiello, C. (2012). Assessing the Impact of Development Disruptions and Dependencies in Analysis of Alternatives of System-of-Systems. Hoboken NJ: Stevens Institute of Technology, Systems Engineering Research Center.
Department of the Navy. (1997). Contractor Performance Assessment Reporting System (CPARS). Washington DC.
Diaz, E., Tuya, J., & Blanco, R. (2003). Automated Software Testing Using a Metaheuristic Technique Based on Tabu Search. Proceedings of the 18th IEEE International Conference on Automated Software Engineering , (pp. 310–313).
Díaz, E., Tuya, J., Blanco, R., & Dolado, J. J. (2008). A Tabu Search Algorithm for Structural Software Testing. Computers & Operations Research, 35(10), 3052–3072.
Director Systems and Software Engineering, OUSD (AT&L). (2008). Systems Engineering Guide for Systems of Systems. available from http://www.acq.osd.mil/se/docs/SE-Guide-for-SoS.pdf.
Djavanshir, G. R., Alavizadeh, A., & Tarokh, M. J. (2012). From System-of-Systems to Meta-Systems: Ambiguities and Challenges. In A. V. Gheorghe, System of Systems.
Dolado, J. J. (2000). A Validation of the Component-Based Method for Software Size Estimation. IEEE Transactions on Software Engineering, 26(10), 1006 –1021.
Dolado, J. J. (2001). On The Problem of the Software Cost Function. Information and Software Technology, 43(1), 61–72.
Dombkins, D. (2007). Complex Project Management. South Carolina: Booksurge Publishing.
Dombkins, D. H. (1996). Project Managed Change: The Application of Project Management Techniques to Strategic Change Programs. Australian Graduate School of Management, Center for Corporate Change. Sydney: University of New South Wales. Retrieved from www.academia.edu/4978870/Wave_Planning_-_Project_Managed_Change_1996_
Dudek, G., Jenkin, M. R., Milios, E., & Wilkes, D. (1996). A taxonomy for multi-agent robotics. Autonomous Robots, 3(4), 375-397.
Dudenhoeffer, D., & Jones, M. (2000). A Formation Behavior for Large Scale Micro-Robot Force Deployment. Proceedings of the 32nd Conference on Winter Simulation.
Dutta, P. K. (1999). Strategies and Games: Theory and Practice. The MIT Press.
Eichfelder, G. (2010). Multiobjective bilevel optimization. Mathematical Programming, 123(2), 419-449.
Epperly, T. G. (1995). Global Optimization of Nonconvex Nonlinear Programs Using Parallel Branch and Bound. Citeseer.
Faratin, P., Sierra, C., & Jennings, N. R. (1998). Negotiation decision functions for autonomous agents. Robotics and Autonomous Systems 24, 159-182.
Farmani, R., & Wright, J. A. (2003). Self-Adaptive Fitness Formulation for Constrained Optimization. IEEE Transactions on Evolutionary Computation, 7(5), 445-455.
Fatima, S. S., Wooldridge, M., & Jennings, N. (2004). An Agenda-based Framework for Multi-issue negotiation. Artificial Intelligence, 152, 1-45.
Flanagan, D., & Brouse, P. (2012). System of Systems Requirements Capacity Allocation. Procedia Computer Science, 8, 112-117.
Fleming, P. J., & Fonseca, C. M. (1995). An overview of evolutionary algorithms in multiobjective optimization. Evolutionary Computation, 3(2), 1-16.
Fleming, P. J., & Fonseca, C. M. (1998). Multi-objective Optimization and Multiple Constraint Handling with Evolutionary Algorithms-Part I. IEEE Transactions on Systems, Man, and Cybernetics- Part A: Systems and Humans, 28(1), 26-37.
Fogel, D. (2006). Evolutionary Computation. New Jersey: John Wiley and Sons.
Fry, D. N., & DeLaurentis, D. A. (2011). Measuring Net-Centricity. Proceedings of the 6th International Conference on System of Systems Engineering. Albuquerque.
Gao, J., Buldyrev, S. V., Stanley, H. E., & Havlin, S. (2011, January). Networks formed from interdependent networks. Nature Physics, 8, 40-48. doi:10.1038/NPHYS2180
Gao, Y., Zhang, G., & Lu, J. (2009). A fuzzy multi-objective bilevel decision support system. International Journal of Information Technology and Decision Making, 8(1), 93–108.
Garrett, R. K., Anderson, S., Baron, N. T., & Moreland, J. D. (2009). Managing the Interstitials, a System of Systems Framework Suited for the Ballistic Missile Defense System. Systems Engineering, 14(1), 87-109.
Garvey, P. R., & Pinto, C. A. (2009). Introduction to functional dependency network analysis. Second International Symposium on Engineering Systems. Cambridge MA: MIT.
Ge, B., Hipel, K. W., Wang, K., & Chen, Y. (2013). A Data-centric Capability-Focused Approach for System-of-Systems Architecture Modeling and Analysis. Systems Engineering, 16(3), 363-377.
Gegov, A. (2010). Fuzzy Networks for Complex Systems. Berlin: Springer.
Giachetti, R. E. (2012). A Flexible Approach to Realize an Enterprise Architecture. Procedia Computer Science, 8, 147-152.
Glover, F. (1989). Tabu Search—Part I. ORSA Journal on Computing, 1(2), 190–206.
Glover, F. (1990). Tabu Search—Part II. ORSA Journal on Computing, 2(1), 4-32.
Goldberg, D. E. (1989). Genetic Algorithms in Search, Optimization, and Machine Learning. Machine Learning, 3(2-3), 95-99.
Gonzalez-Zugasti, J. P., Otto, K. N., & Baker, J. D. (2000). A Method for Architecting Product Platforms. Research in Engineering Design(12), 61-72.
Gries, M. (2004). Methods for Evaluating and Covering the Design Space during Early Design Development. Integration, the VLSI Journal, 38(2), 131-183.
Guariniello, C., & DeLaurentis , D. (2014). Communications, information, and cyber security in Systems-of-Systems: Assessing the impact of attacks through interdependency analysis. Procedia Computer Science; Conference on Systems Engineering Research. Redondo Beach CA: Elsevier.
Haimes, Y. Y. (2012). Modeling Complex Systems of Systems with Phantom System Models. Systems Engineering, 15(3), 333-346.
Han, S. Y., & DeLaurentis, D. (2013). Development Interdependency Modeling for System-of-Systems (SoS) using Bayesian Networks: SoS Management Strategy Planning. Procedia Computer Science, Volume 16, 698–707. Atlanta.
Hansen , P., Jaumard , B., & Savard, G. (1992). New branch-and-bound rules for linear bilevel programming. SIAM Journal on Scientific and Statistical Computing, 13(5), 1194–1217.
Hassan, R., & Crossley, W. (2004). Discrete Design Optimization under Uncertainty: A Generalized Population-Based Sampling Genetic Algorithm. AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics & Materials Conference. Palm Springs.
Hassan, R., de Weck, O., & Springmann, P. (2004). Architecting a Communication Satellite Product Line. 22nd International Communication Satellite Systems Conference. Monterey, CA.
He, Z., Yen, G. G., & Zhang, J. (2013). Fuzzy-Based Pareto Optimality for Many-Objective Evolutionary Algorithms. IEEE Transactions on Evolutionary Computation, PP(99), 1-16.
Henson, S. A., Henshaw, M. J., Barot, V., Siemieniuch, C. E., Dogan, H., Lim, S. L., . . . DeLaurentis, D. (2013). Towards a Systems of Systems Engineering EU Strategic Research Agenda. IEEE International Conference on System of Systems Engineering. Maui.
Herroelen, W., De Reyck, B., & Demeulemeester, E. (1998). Resource-constrained project scheduling: a survey of recent developments. Computers & Operations Research, 25(4), 279-302.
Hill climbing. (n.d.). Retrieved October 24, 2013, from http://en.wikipedia.org/wiki/Hill_climbing
Holland, J. H. (1973). Genetic Algorithm and the Optimal Allocation of Trials. SIAM Journal on Computing, vol 2, June 1973, 88-105.
Hunt, B., Lipsman, R. L., & Rosenberg, J. M. (2001). A guide to MATLAB: For beginners and experienced users. New York: Cambridge University Press.
Christian III, J. A. (2004). A Quantitative Approach to Assessing System Evolvability. Systems Engineering, 770, 1-16.
INCOSE. (2011). Systems Engineering Handbook v. 3.2.2. San Diego: INCOSE.
Jackson, S., & Ferris, T. L. (2013). Resilience Principles for Engineered Systems. Systems Engineering, 16(2), 152-164.
Jia, L., Wang, Y., & Fan, L. (2011). Uniform Design Based Hybrid Genetic Algorithm for Multiobjective Bilevel Convex Programming. Proceedings of the Seventh International Conference on Computational Intelligence and Security, (pp. 159-163).
Johnston, W., Mastran, K., Quijano, N., & Stevens, M. (2013). Unmanned Vehicle Search and Rescue Initiative. SE4150 Design Project, Naval Postgraduate School.
Joint Staff. (2010). CJCSM 3500.04C, UNIVERSAL JOINT TASK LIST (UJTL). Washington DC: Department of Defense.
Jonker, C. M., Robu, V., & Treur, J. (2007). An Agent Architecture for Multi-attribute Negotiation Using Incomplete Preference Information. Auton Agent Multi-agent System, 15, 221-252.
Karnik, N. N., Mendel, J. M., & Liang, Q. (1999). Type-2 fuzzy logic systems. IEEE Transactions on Fuzzy Systems, 7(6), 643-658.
Kilicay Ergin, N., & Dagli, C. H. (2008). In M. (. Jamshidi, System of Systems: Innovations for the 21st Century (pp. 77-100). Wiley & Sons, Inc.
Kilicay-Ergin, N., Enke, D., & Dagli, C. (2012). Biased trader model and analysis of financial market dynamics. International Journal of Knowledge-based Intelligent Engineering Systems, Vol. 16, 99-116.
Kinnunen, M. J. (2006). Complexity Measures for System Architecture Models. Cambridge: MIT: System Design and Management Program Masters Thesis.
Kirkpatrick, S., Gelatt, C. D., & Vecchi, M. P. (1983). Optimization by Simulated Annealing. Science, 220(4598), 671–680.
Koch, P. N., Evans, J. P., & Powell, D. (2002). Interdigitation for Effective Design Space Exploration Using iSIGHT. Structural and Multidisciplinary Optimization, 23(2), 111–126.
Konur, D., & Dagli, C. (2014). Military system of systems architecting with individual system contracts. Optimization Letters, October.
Konur, D., & Golias, M. M. (2013). Cost-stable truck scheduling at a cross-dock facility with unknown truck arrivals: A meta-heuristic approach. Transportation Research Part E, 49(1), 71-91.
Kraus, S. (1996). An Overview of Incentive Contracting. Artificial Intelligence, 83, 297-346.
Kraus, S. (2001). Automated negotiation and decision making in multiagent environments. Easss 2086, 150-172.
Krothapali, N. K., & Deshmukh, A. V. (1999). Design of Negotiation Protocols for Multi-agent Manufacturing Systems. International Journal of Production Research, 37(7), 1601-1624.
Lafleur, J. M. (2012). A Markovian State-Space Framework for Integrating Flexibility into Space System Design Decisions. Atlanta: Georgia Institute of Technology School of Aerospace Engineering Doctoral Thesis.
Lamar, B. W. (2009). Min-additive utility functions. MITRE Corporation. MITRE Corporation. Retrieved from http://www.mitre.org/sites/default/files/pdf/09_0383.pdf
Lane, J. A., & Bohn, T. (2012). Using SysML Modeling to Understand and Evolve System of Systems. Systems Engineering, 16(1), 87-98.
Li, C., & Chiang, T.-W. (2013). Complex Neurofuzzy ARIMA Forecasting - A New Approach Using Complex Fuzzy Sets. IEEE Transactions on Fuzzy Systems, 21(3), 567-584.
Li, M., Lin, D., & Wang, S. (2010). Solving a type of biobjective bilevel programming problem using NSGA-II. Computers and Mathematics with Applications, 59(2), 706-715.
Lin, Y., Cunningham III, G. A., Coggshall, S. V., & Jones, R. D. (1998, September). Nonlinear System Input Structure Identification: Two Stage Fuzzy Curves and Surfaces. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 28(5), 678- 684.
Lopes, F., Wooldridge, M., & Novais, A. C. (2008). Negotiation among autonomous computational agents: principles, analysis and challenges. Artificial Intelligence Review 29, 1-44.
Lu, J., Zhang, G., & Dillon, T. (2008). Fuzzy multi-objective bilevel decision making by an approximation kth-best approach. Multiple-Valued Logic and Soft Computing, 14, 205–232.
Luzeaux, D. (2013). System-of-Systems and Large-Scale Complex Systems Architecting. Complex Systems Design & Management. Paris. Retrieved from http://www.csdm2013.csdm.fr/IMG/pdf/2nd_keynote_Speaker_-_Dominique_LUZEAUX.pdf
Maier, M. W., & Rechtin, E. (2009). The Art of Systems Architecting, 3rd ed. Boca Raton: CRC Press.
Malan, R., & Bredemeyer, D. (2001, August). Software Architecture Resources. (BREDEMEYER CONSULTING) Retrieved Dec 2014, from http://www.bredemeyer.com/pdf_files/NonFunctReq.PDF
Maskin, E. S. (1996). Theories of the Soft Budget Constraint. Japan & the World Economy, 8, 125-133.
Masud, A. S., & Hwang, C. L. (1979). Multiple Objective Decision Making – Methods and Applications. In Lecture Notes in Economics and Mathematical Systems (Vol. 164). Springer.
Mekdeci, B., Shah, N., Ross, A. M., Rhodes, D. H., & Hastings, D. (2014). Revisiting the Question: Are Systems of Systems just (traditional) Systems or are they a new class of Systems? Cambridge, MA: Systems Engineering Advancement Research Initiative (SEAri).
Mendel, J. M. (2013). On KM Algorithms for Solving Type-2 Fuzzy Set Problems. IEEE Transactions on Fuzzy Systems, 21(3), 426-446.
Mendel, J. M., & John, R. I. (2002). Type-2 Fuzzy Sets Made Simple. IEEE Transactions on Fuzzy Systems, 10(2), 117-127.
Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., & Teller, E. (1953). Equation of State Calculations by Fast Computing Machines. The Journal of Chemical Physics, 21(6), 1087–1092.
Michalewicz, Z., & Schoenauer, M. (1996). Evolutionary Algorithms for Constrained Parameter Optimization Problems. Evolutionary Computation, 4(1), 1-32.
Miettinen, K. M. (1999). Nonlinear Multi-Objective Optimization. Boston: Kluwer Academic Publishers.
Migdalas, A., Pardalos, P. M., & Värbrand, P. (1998). Multilevel Optimization: Algorithms and Applications. Dordrecht: Kluwer Academic Publishers.
Min, B., & Chang, S. H. (1991). System Complexity Measure in the Aspect of Operational Difficulty. IEEE Transactions Nuclear Science, 38(5), 1035-1039.
Mordecai, Y., & Dori, D. (2013). I5: A Model-Based Framework for Architecting System-of-Systems Interoperability, Interconnectivity, Interfacing, Integration, and Interaction. International Symposium of the International Council on Systems Engineering (INCOSE). Philadelphia.
Mostafavi, A., Abraham, D. M., Noureldin, S., Pankow, G., Novak, J., Walker, R., . . . George, B. (2012). Risk-based Protocol for the Inspection of Transportation Construction Projects undertaken by State Departments of Transportation. Journal of Construction Engineering & Management, ASCE, 139(8), 977-986.
Osman, M. S., Abo-Sinna, M. A., Amer, A. H., & Emam, O. E. (2004). A multi-level non-linear multi-objective decision-making under fuzziness. Applied Mathematics and Computation, 153(1), 239-252.
Pape, L., & Dagli, C. (2013). Assessing robustness in systems of systems meta-architectures. Procedia Computer Science, Complex Adaptive Systems. 20, pp. 262-269. Baltimore: Elsevier.
Pape, L., Agarwal, S., Giammarco, K., & Dagli, C. (2014). Fuzzy Optimization of Acknowledged System of Systems Meta-architectures for Agent based Modeling of Development. Conference on Systems Engineering Research. Redondo Beach CA.
Pape, L., Giammarco, K., Colombi, J., Dagli, C., Kilicay-Ergin, N., & Rebovich, G. (2013). A fuzzy evaluation method for system of systems meta-architectures. Procedia Computer Science, Conference on Systems Engineering Research (CSER’13). 16, pp. 245-254. Atlanta: ScienceDirect, Elsevier.
Parsons, S., & Wooldridge, M. (2000). Languages for Negotiation. Proceedings of the Fourteenth European Conference on Artificial Intelligence (ECAI-2000).
Pedrycz, W., Ekel, P., & Parreiras, R. (2011). Fuzzy Multicriteria Decision Making; Models, Methods and Applications. West Sussex: John Wiley & Sons.
Pieume , C. O., Marcotte , P., Fotso , L. P., & Siarry. , P. (2011). Solving bilevel linear multiobjective programming problems. American Journal of Operations Research, 1, 214–219.
Pitsko, R., & Verma, D. (2012). Principles for Architecting Adaptable Command and Control. New Challenges in Systems Engineering and Architecting. St. Louis.
Räihä, O. (2008). Applying Genetic Algorithms in Software Architecture Design. University of Tampere.
Ravindran, A., Ragsdell, K. M., & Reklaitis, G. V. (2006). Engineering Optimization Methods and Applications. Hoboken, NJ: John Wiley & Sons.
Rela, L. (2004). Evolutionary Computing in Search-Based Software Engineering. Lappeenranta University of Technology.
Ricci, N., Ross, A. M., Rhodes, D. H., & Fitzgerald, M. E. (2013). Considering Alternative Strategies for Value Sustainment in Systems-of-Systems (Draft). Cambridge MA: Systems Engineering Advancement Research Initiative.
Rios, L. M., & Sahinidis, N. V. (2009). Derivative-Free Optimization: A Review of Algorithms and Comparison of Software Implementations. Advances in Optimization II.
Rosenau, W. (1991). Coalition Scud Hunting in Iraq, 1991. RAND Corporation.
Ross, A. M., Rhodes, D. H., & Hastings, D. E. (2008). Defining Changeability: Reconciling Flexibility, Adaptability, Scalability, Modifiability, and Robustness for Maintaining System Lifecycle Value. Systems Engineering, 246 - 262.
Ross, S. (2003). Introduction to Probability Models, 8th Edition. Academic Press.
Rostker, B. (2000, July 25). Iraq's Scud Ballistic Missiles. Retrieved Sep 12, 2013, from Iraq Watch: http://www.iraqwatch.org/government/US/Pentagon/dodscud.htm
Runarsson, T. P., & Yao, X. (2000). Stochastic Ranking for Constrained Evolutionary Optimization. IEEE Transactions on Evolutionary Computation, 4(3), 284-294.
Russell, S., & Norvig, P. (2009). Artificial Intelligence: A Modern Approach. Prentice Hall.
Sanz, J. A., Fernandez, A., Bustince, H., & Herrera, F. (2013). IVTURS: A Linguistic Fuzzy Rule-Based Classification System Based On a New Interval-Valued Fuzzy Reasoning Method with Tuning and Rule Selection. IEEE Transactions on Fuzzy Systems, 21(3), 399-411.
Schreiner, M. W., & Wirthlin, J. R. (2012). Challenges using modeling and simulation in architecture. Procedia Computer Science, 8, 153-158.
Schwartz, M. (2010). Defense acquisitions: How DoD acquires weapon systems and recent efforts to reform the process. Washington DC: Library of Congress, Congressional Research Service.
Selva, D., & Crawley, E. F. (2013). VASSAR: Value Assessment of System Architectures using Rules. IEEE Aerospace Conference (pp. 1-21). Big Sky MT: IEEE.
Shi, X., & Xia, H. S. (2001). Model and Interactive Algorithm of Bi-Level Multi-Objective Decision-Making with Multiple Interconnected Decision Makers. Journal of Multi-criteria Decision Analysis, 10, 27-34.
Shtub, A., Bard, J. F., & Globerson, S. (1994). Project Management. New Jersey: Prentice Hall.
Simpson , T. W., & D’souza, B. S. (2002). Assessing Variable Levels of Platform Commonality within a Product Family Using a Multiobjective Genetic Algorithm. 9th AIAA/ISSMO Symposium on Multidisciplinary Analysis and Optimization. Atlanta, GA.
Singer, Y. (2006). Dynamic Measure of Network Robustness. IEEE 24th Conference of Electrical and Electronic Engineers in Israel.
Singh, A., & Dagli, C. H. (2009). Multi-objective Stochastic Heuristic Method for Trade Space Exploration of a Network Centric System of Systems. 3rd Annual IEEE International Systems Conference, 2009. Vancouver, Canada.
Singh, A., & Dagli, C. H. (2010). "Computing with words" to support multi-criteria decision-making during conceptual design. Systems Research Forum, 85-99.
Smartt, C., & Ferreira, S. (2012). Constructing a General Framework for Systems Engineering Strategy. Systems Engineering, 15(2), 140-152.
SPEC Innovations. (n.d.). Model-Based Systems Engineering Tools. Retrieved August 20, 2014, from https://www.innoslate.com/systems-engineering/
Suarez, R. (2004, Dec 9). Troops Question Secretary of Defense Donald Rumsfeld about Armor. Retrieved Apr 14, 2014, from PBS NewsHour: http://www.pbs.org/newshour/bb/military-july-dec04-armor_12-9/
Sumathi, S., & Surekha, P. (2010). Computational Intelligence Paradigms: Theory & Applications Using MATLAB. Boca Raton FL: CRC Press.
Talbot, F. B. (1982). Resource-constrained project scheduling with time-resource tradeoffs: The nonpreemptive case. Management Science, 28(10), 1197-1210.
Taleb, N. N. (2004). Fooled by Randomness. New York: Random House Trade Paperbacks.
Thompson, M. (2002, December 23). Iraq: The Great Scud Hunt. Time Magazine, pp. http://www.time.com/time/magazine/article/0,9171,1003916,00.html.
Trans-Atlantic Research and Education Agenda in Systems of Systems (T-AREA-SOS) Project. (2013). The Systems of Systems Engineering Strategic Research Agenda. Loughborough: Loughborough University.
Wall, M. B. (1996). A genetic algorithm for resource-constrained scheduling. Cambridge, MA: Doctoral dissertation, Massachusetts Institute of Technology.
Wang, J.-Q., & Zhang, H.-Y. (2013). Multicriteria Decision-Making Approach Based on Atanassov's Intuitionistic Fuzzy Sets With Incomplete Certain Information on Weights. IEEE Transactions on Fuzzy Systems, 21(3), 510-515.
Wang, R., & Dagli, C. H. (2011). Executable system architecting using systems modeling language in conjunction with colored Petri nets in a model‐driven systems development process. Systems Engineering, 14(4), 383-409.
Wang, R., Agarwal, S., & Dagli, C. (2014). Executable System of Systems Architecture Using OPM in Conjunction with Colored Petri Net: A Module for Flexible Intelligent & Learning Architectures for System of Systems. Europe Middle East & Africa Systems Engineering Conference (EMEASEC). Somerset-West, Western Cape, South Africa.
Wang, Y., Cai, Z., Guo, G., & Zhou, Y. (200). Multiobjective Optimization and Hybrid Evolutionary Algorithm to Solve Constrained Optimization Problems. IEEE Transactions Systems, Man, and Cybernetics- Part B: Cybernetics, 34(3), 560-575.
Wanyama , T., & Far, B. H. (2007). A Protocol for Multi-agent Negotiation in a Group choice Decision Making Process. Journal of Network and Computer Applications, 30, 1173-1195.
Wappler, S. (2007). Automatic Generation of Object-Oriented Unit Tests Using Genetic Programming. Berlin: Universitätsbibliothek der Technischen Universität.
Wappler, S., & Wegener, J. (2006). Evolutionary Unit Testing of Object-Oriented Software Using Strongly-Typed Genetic Programming. Proceedings of the 8th annual conference on Genetic and evolutionary computation, (pp. 1925–1932).
Warfield, J. N. (1973). Binary Matrices in System Modeling. IEEE Transactions on Systems, Man, and Cybernetics, SMC-3(No. 5, September), 441-449.
Weiss, W. (2008). Dynamic Security: An Agent-based Model for Airport Defense. Proceedings of the Winter Simulation Conference.
Woldesenbet, Y. G., Yen, G. G., & Tessema, B. G. (2009). Constraint Handling in Multiobjective Evolutionary Optimization. IEEE Transactions on Evolutionary Computation, 13(3), 514-525.
Yi, J. S., Kang, Y., Stasko, J. T., & Jacko, J. A. (2007, Nov/Dec). Toward a Deeper Understanding of the Role of Interaction in Information Visualization. IEEE Transactions on Visualization and Computer Graphics, 13(6), 1224 - 1231.
Yu, O.-Y., Gulkema, S. D., Briaud, J.-L., & Burnett, D. (2011). Sensitivity Analysis for Multi-Attribute System Selection Problems in Onshore Environmentally Friendly Drilling (EFD). Systems Engineering, 15(2), 153-171.
Yu, T.-L., Yassine, A. A., & Goldberg, D. E. (2007). An information theoretic method for developing modular architectures using genetic algorithms. Research in Engineering Design, 18(2), 91-109.
Zadeh, L. A. (1975). Fuzzy logic and approximate reasoning. Synthese, 30(3-4), 407-428.
Zhang, G., Lu, J., & Gao, Y. (2008). An algorithm for fuzzy multi-objective multi-follower partial cooperative bilevel programming. Journal of Intelligent & Fuzzy Systems, 19, 303–319.
Zhang, G., Lu, J., & Gao, Y. (2008). Fuzzy bilevel programming: multi-objective and multi-follower with shared variables. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 16, 105-133.
Zhang, G., & Lu, J. (2010). Fuzzy bilevel programming with multiple objectives and cooperative multiple followers. Global Optimization, 47, 403-419.
Zhang, T., Hu, T., Jia-wei, C., Wang, Z., & Guo, X. (2012). Solving Bilevel Multiobjective Programming Problem by Elite Quantum Behaved Particle Swarm Optimization. Journal of Applied Mathematics, 2012. doi:10.1155/2012/102482
Zhang, T., Hu, T., Zheng, Y., & Guo, X. (2012). An Improved Particle Swarm Optimization for Solving Bilevel Multiobjective Programming Problem. Journal of Applied Mathematics, 2012. doi:10.1155/2012/626717
Zheng, Y., Wan, Z., & Wang, G. (2011). A fuzzy interactive method for a class of bilevel multiobjective programming problem. Expert Systems with Applications, 38, 10384–10388.
Zhou, A., Qu, B.-Y., Li, H., Zhao, S.-Z., Suganthan, P. N., & Zhang, Q. (2011). Multiobjective Evolutionary Algorithms: A Survey of the State of the Art. Swarm and Evolutionary Computation(1), 32-49.