Lifecycle Modeling Language Tutorial by Dr. Dam and Dr. Vaneman
DESCRIPTION
Dr. Steve Dam and Dr. Warren Vaneman present "A New Open Standard: Lifecycle Modeling Language (LML) a Language for Simple, Rapid Development, Operations and Support" at the Systems Engineering D.C. Conference (SEDC), April 2014.
TRANSCRIPT
© 2014 Lifecycle Modeling Steering Committee. All Rights Reserved
A New Open Standard: Lifecycle Modeling
Language (LML) a Language for Simple,
Rapid Development, Operations and
Support
April 2014
Steven H. Dam, Ph.D., ESEP, President, SPEC Innovations, [email protected]
Warren K. Vaneman, Ph.D., Naval Postgraduate School, [email protected]
Overview
Why a New Language?
Lifecycle Modeling Language Overview
Break (10 minutes)
LML Visualizations
LML Tool Needs
o NASA “Requirements”
o Implementation of LML
Break (10 minutes)
Sample Problem Demonstration
o Requirements Analysis
o Functional Analysis
o Synthesizing Solutions
o Verification
o Project Management
o Creating Reports
Summary & Questions
Why a New Language?
Industry Development and Endurance Timelines
• Industry has embraced the benefits of global technology.
• Development time has decreased.
• Perishability/obsolescence is increasing
Current Environment
Adversary Development and Counter-Measure Timelines
• Adversaries are also leveraging COTS technology.
• Emerging threats demand we enhance the speed at which we deliver new capabilities.
SOURCE: Systems 2020 Study Final Report, Booz Allen Hamilton, August 2010
DoD Development and Endurance Timelines
• Long development times
• Good for fixed threats
• However, against asymmetric threats, perishability/obsolescence is increasing
Current Environment
U.S. is Faced with Highly Unpredictable, but Surmountable Threats
• Acquisition methods designed for warfare during the Cold War are ill-suited for the dynamic nature of threats today.
SOURCE: Systems 2020 Study Final Report, Booz Allen Hamilton, August 2010
Complex Systems Implications for Systems Engineering
• Larger, more complex systems development implies:
– A need for larger, distributed teams
– A need for a clear, concise way to express the system design (clear, logically consistent semantics)
– New tools to enable collaboration across the entire lifecycle
Complexity has been identified by many as a critical problem facing system engineers
Complexity
• With the growth of the Internet and daily changes in IT, systems have become more complex and change more rapidly than ever before
• Systems engineering methods have not kept up with these changes
• SE has been relegated to the beginning of the lifecycle
From a presentation by Dr. Michael Ryschkewitsch, NASA Chief Engineer, at CSER Conference 15 April 2011
How Does SE Typically Respond to Complexity?
• Focus on upper-left hand side of the “Vee”
• Systems Architecture and Requirements Development
– More complex MBSE languages
– More complex processes and governance
• More layers of abstraction
– “Systems of Systems”
– “Family of Systems”
– “Portfolio Management”
• Need more time and money!
More Money Is a Problem
• Calls for doing more with less continue
• Need for lower labor and tool costs is essential for acceptance of SE across the lifecycle
From a presentation by Dr. Michael Ryschkewitsch, NASA Chief Engineer, at CSER Conference 15 April 2011
How can we simplify things to enable “quicker/cheaper”? Let’s start with the language we use
Overlap Among MBSE Techniques
SOURCE: Grady, J.O. (2009). Universal Architecture Description Framework, INCOSE.
State of Current “Languages”
• In the past decade, the Unified Modeling Language (UML) and the Systems Modeling Language (SysML) have dominated the discussion
• Why?
– Perception that software is “the problem”
– Hence the need for an “object” approach
• SysML was designed to relate systems thinking to software development, thus improving communication between systems engineers (SEs) and software developers
Why Objects Are Not the Answer
• Although SysML may improve the communication of design between SEs and software developers, it does not communicate well to anyone else
– No other discipline in the lifecycle uses object-oriented design and analysis extensively
– Users in particular have little interest in, or acceptance of, this technique
– Software developers who have adopted Agile programming techniques want functional requirements (and resent SEs trying to write software)
– Many software languages are hybrid object and functional
So What Do We Do?
• Recognize that our primary job as SEs is to communicate between all stakeholders in the lifecycle
• Be prepared to translate between all the disciplines
• Reduce complexity in our language to facilitate communication
Lifecycle Modeling Language (LML) Overview
Lifecycle Modeling Language (LML)
• LML combines the logical constructs with an ontology to capture information
– SysML – mainly constructs – limited ontology
– DoDAF Metamodel 2.0 (DM2) – ontology only
• LML simplifies both the “constructs” and “ontology” to make them more complete, yet easier to use

Goal: A language that works across the full lifecycle
LML Models
• Primary Entities: Asset/Resource, Connection
• Primary Entities: Action, Input/Output
LML Ontology* Overview
• Taxonomy**:
– 12 primary entity classes
– Many types of each entity class
• Action (types = Function, Activity, Task, etc.)
• Relationships: almost all classes related to each
other and themselves with consistent words
– Asset performs Action/Action performed by Asset
– Hierarchies: decomposed by/decomposes
– Peer-to-Peer: related to/relates
*Ontology = Taxonomy + relationships among terms and concepts
**Taxonomy = Collection of standardized, defined terms or concepts
LML Schema Elements
• Action
• Artifact
• Asset
– Resource
• Characteristic
– Measure
• Connection
– Logical
– Conduit
• Cost
• Input/Output
• Location
– Physical
– Orbital
– Virtual
• Risk
• Software Interface
• Statement
– Requirement
– Decision
• Time
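The schema elements listed above can be captured as a simple data structure. A minimal sketch, assuming a dictionary layout of my own invention (the class and subtype names come from the slide; the structure itself is not part of the LML specification):

```python
# Hypothetical sketch of the LML taxonomy: 12 primary entity classes,
# some with subtypes. Names are from the slide; the dict layout is an
# illustrative assumption, not part of the LML specification.
LML_CLASSES = {
    "Action": [],
    "Artifact": [],
    "Asset": ["Resource"],
    "Characteristic": ["Measure"],
    "Connection": ["Logical", "Conduit"],
    "Cost": [],
    "Input/Output": [],
    "Location": ["Physical", "Orbital", "Virtual"],
    "Risk": [],
    "Software Interface": [],
    "Statement": ["Requirement", "Decision"],
    "Time": [],
}

def is_valid_type(cls, subtype=None):
    """Check that a class (and optional subtype) exists in the taxonomy."""
    if cls not in LML_CLASSES:
        return False
    return subtype is None or subtype in LML_CLASSES[cls]
```

A tool implementing the ontology could use such a table to validate every entity it stores against the taxonomy.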
LML Primary Entities and Relationships
[Figure: LML primary entities – Artifact, Statement (Requirement), Characteristic (Measure), Action, Asset (Resource), Input/Output, and Connection (Conduit) – linked by relationship pairs such as source of/sourced by, traced from/traced to, performed by/performs, specified by/specifies, transferred by/transfers, connected by/connects, generated by/generates, and received by/receives; each entity is also decomposed by/decomposes entities of its own class]
[Table: LML relationship matrix relating each of the 12 entity classes – Action, Artifact, Asset (Resource), Characteristic (Measure), Connection (Conduit, Logical), Cost, Decision, Input/Output, Location (Orbital, Physical, Virtual), Risk, Statement (Requirement), and Time – to every other class and to itself, using consistent verb pairs such as performs/performed by, references/referenced by, specifies/specified by, incurs/incurred by, enables/enabled by, results in/result of, generates/generated by, receives/received by, locates/located at, causes/caused by, mitigates/mitigated by, resolves/resolved by, satisfies/satisfied by, traced from/traced to, verifies/verified by, and occurs/occurred by; entries marked * relate a class to itself]

LML Relationships Provide Linkage Needed Between the Classes
• decomposed by/decomposes
• orbited by/orbits
• related to/relates
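Because every LML relationship has a consistent inverse verb (decomposed by/decomposes, performed by/performs, and so on), a tool only needs to store each relationship once. A minimal sketch of that idea; the storage scheme and entity names here are illustrative assumptions, not part of the LML specification:

```python
# Illustrative sketch (not part of the LML spec): relationships come in
# inverse verb pairs, so storing one direction lets the other be derived.
INVERSE = {
    "decomposed by": "decomposes",
    "performed by": "performs",
    "related to": "relates",
    "orbited by": "orbits",
    "traced from": "traced to",
}
# Make the lookup table symmetric so either verb of a pair can be queried.
INVERSE.update({v: k for k, v in list(INVERSE.items())})

relations = []  # (subject, verb, object) triples, stored once per pair

def relate(subj, verb, obj):
    relations.append((subj, verb, obj))

def relations_of(entity):
    """Every relationship touching an entity, inverses included."""
    found = []
    for s, v, o in relations:
        if s == entity:
            found.append((v, o))
        if o == entity:
            found.append((INVERSE[v], s))
    return found

# "Asset performs Action / Action performed by Asset" from the slide
relate("Action: Sense Targets", "performed by", "Asset: Sensor Platform")
```

Querying either end of the pair then yields the correctly worded relationship, which is what keeps the matrix above consistent.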
LML Translation
• Two types of mapping for tailoring:
– Map names of classes to enable other “schema” models to be used
– Map symbols used (e.g., change from LML Logic to Electrical Engineering symbols)
– Enable diagram translations (e.g., Action Diagram to IDEF0)

Symbol mapping: LML Symbol | Electrical Engineering | BPMN | … (example: AND)

Class mapping:
LML Class | DM2 | SysML | …
Action | Activity | Activity
Asset | Performer | Actor
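The class-name mapping can be sketched as a lookup table. Only the Action and Asset rows appear on the slide, so the tables below are deliberately incomplete; a full implementation would follow the published LML translation tables:

```python
# Sketch of the class-name mapping idea; only the Action and Asset rows
# appear on the slide, so these tables are deliberately incomplete.
LML_TO = {
    "DM2":   {"Action": "Activity", "Asset": "Performer"},
    "SysML": {"Action": "Activity", "Asset": "Actor"},
}

def translate(lml_class, target):
    """Map an LML class name into another schema's vocabulary,
    falling back to the LML name when no mapping is defined."""
    return LML_TO.get(target, {}).get(lml_class, lml_class)
```

For example, `translate("Asset", "SysML")` yields `"Actor"`, while an unmapped class such as `Cost` passes through unchanged.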
Example: Translation to DM2
SOURCE: DoDAF V2.0 PDF, p. 27
DM2 Conceptual Model to LML Schema Mapping
DM2 Schema Element (Conceptual) | LML Equivalent
Activity | Action
Capability | Action with “Capability” type
Condition | Characteristic with “Condition” type
Information/Data | Input/Output
Desired Effect | Statement with “Desired Effect” type
Guidance | Statement with “Guidance” type
Measure | Measure
Measure Type | Measure types
Location | Location
Project | Action with “Project” type
Resource | Asset with types for “Materiel,” “Organization,” etc.
Skill | Characteristic with “Skill” type
Vision | Statement with “Vision” type
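The mapping table above is pure data, so it translates directly into code. A sketch, assuming a (class, type) tuple encoding that is my own illustrative choice, not part of either standard:

```python
# The DM2-to-LML table above captured as data. The (class, type) tuple
# encoding is an illustrative choice, not part of either standard.
DM2_TO_LML = {
    "Activity": ("Action", None),
    "Capability": ("Action", "Capability"),
    "Condition": ("Characteristic", "Condition"),
    "Information/Data": ("Input/Output", None),
    "Desired Effect": ("Statement", "Desired Effect"),
    "Guidance": ("Statement", "Guidance"),
    "Measure": ("Measure", None),
    "Location": ("Location", None),
    "Project": ("Action", "Project"),
    "Resource": ("Asset", "Materiel"),  # also "Organization", etc.
    "Skill": ("Characteristic", "Skill"),
    "Vision": ("Statement", "Vision"),
}

def to_lml(dm2_element):
    """Render a DM2 conceptual element as its LML equivalent string."""
    cls, subtype = DM2_TO_LML[dm2_element]
    return cls if subtype is None else f'{cls} with "{subtype}" type'
```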
Break
LML Visualizations
Key diagrams needed to better understand the information
LML Sequencing
No constructs – only special types of Actions – ones that enable the modeling of command and control/information assurance to capture the critical decisions in your model

[Figure: LML sequencing constructs – SEQUENTIAL (Action A then Action B); SELECTION (OR: Condition 1 or Condition 2 selects Action B or Action C); PARALLEL (SYNC: Action A and Action B, coordinated by Asset C); LOOP (Action repeats until exit criteria met, e.g., Until r < z); ITERATE (over a Range, e.g., 1 to n)]
LML Action Diagram Captures Functional and Data Flow
[Figure: example Action Diagram from Start to End – Actions 1.1 through 1.7, an OR branch (“Which path?”), Actions in parallel joined by SYNC (“Synchronize Information”), a LOOP with exit criteria containing Optional Action 1 and Optional Action 2, a Trigger, External Input and External Output, and Input/Outputs 1–3 flowing between the actions]
Example: Sync Execution Logic
[Figure: Action Diagram vs. Timeline.
• No trigger: the Action is enabled and will execute; with Duration = x sec, it runs from 0 sec to x sec.
• With trigger: the Action is enabled, but will not execute until the trigger is received; if the trigger takes y sec to arrive, the Action waits and runs from y sec to y + x sec.]
Execution Logic – Concurrency No Trigger; No Coordination Action
[Figure: No trigger: Action A is enabled and will execute; Asset A performs Action A (Duration = x sec), running from 0 sec to x sec. Action B (Duration = y sec) is coordinated by Asset A via sync, giving a Start-to-Start (SS) relationship between A and B.]
Execution Logic – Concurrency With Trigger; No Coordination Action
[Figure: Trigger: Action A is enabled, but must wait to execute; Asset A performs Action A (Duration = x sec). Action B (Duration = y sec) is coordinated by Asset A via sync; the trigger arrives after y sec, so Action A waits and finishes at y + x sec – a Finish-to-Start (FS) relationship between B and A.]
Execution Logic – Concurrency No Trigger; With Coordination Action
[Figure: No trigger: Action C is enabled and will execute; Asset A performs Action A (Duration = z sec) and Action C (Duration = x sec) with overlap in time. Action B (Duration = y sec) is coordinated by Asset A via sync, with z > y; Finish-to-Start (FS) between B and A.]
Execution Logic – Concurrency No Trigger; With Coordination Action
[Figure: No trigger: Action C is enabled and will execute; Asset A performs Action C and then Action A (Duration = z sec). Action B (Duration = x sec) is coordinated by Asset A via sync (Duration = y sec), with y > z; Finish-to-Start (FS) between C and A.]
Execution Logic – Concurrency With Trigger; With Coordination Action
[Figure: With trigger: Action C is enabled and will execute; Asset A performs Action A and Action C (Duration = z sec). Action B (Duration = x sec) is coordinated by Asset A via sync (Duration = y sec), with y > z; Action A waits for the trigger; Finish-to-Start (FS) between B and A.]
Problem with not including sync
[Figure: Physical View – a Client and Server connected by an Interface. Functional View – Request Information sends an Information Request and Provide Information returns Information. Deadlock occurs; no IA or C2 functionality.]
Problem with not including sync
[Figure: Functional View with a Sync added – Request Information sends an Information Request; Validate/Authorize Request (coordinated by an IA Asset) issues an Authorization; Provide Information returns Information. Deadlock relieved; IA functionality identified.]
LML Physical Block Diagram*
[Figure: Sensor Systems Operator (P.5.2.1, Asset (Human)) connected with/connects via I.1.3 Operator-Sensor Platform Interface to Sensor Platform (P.5.2.2, Asset (System)), which is used by/uses Sensor System Memory (R.5.2.2.1, Asset (Resource)) with characteristics: capacity (10 Mbits/sec), latency (100 millisec), maximum quantity (6 Gbytes), minimum quantity (10 Kbytes)]

*Note: Work in progress
LML Combined Physical Behavior Diagram* Enables Instances and Clones
[Figure: Sensor Systems Operator (P.5.2.1, Asset (Human)) connected with/connects via I.1.3 Operator-Sensor Platform Interface to Sensor Platform A(1), A(2), and A(3) (each P.5.2.2, Asset (System)/Asset (Resource)). Clones provide multiple instances of an Asset for use in simulation. A.3.1 Deploy Sensor and A.3.2 Sense Targets (Action (System Function)) exchange input to/input from a Deploy Command.]

*Note: Work in progress
Diagrams Are Needed for Every Class
• Physical Block (Asset) Diagrams – with option for icon substitution
• Interface Diagrams – N2 (Assets or Actions)
• Hierarchy Diagrams – automatically color coded by class
• Time Diagrams – Gantt Charts, Timeline Diagram
• Location Diagrams – maps for Earth, orbital charts
• Class/Block Definition Diagram – data modeling
• Risk Chart – standard risk/opportunity chart
• Organization Charts – showing lines of communication, as well as lines of authority
• Pie/Bar/Line Charts – for cost and performance
• Combined Physical and Functional Diagram
LML Summary
• LML provides the fundamental foundation for a tool to support SE in the cloud
• LML contains the basic technical and programmatic classes needed for the lifecycle
• LML defines the Action Diagram to enable better definition of logic as functional requirements
• LML uses the Physical Diagram to provide for abstraction, instances, and clones, thus simplifying physical models
• LML provides the “80% solution”
– It can be extended to meet specific needs (e.g., adding Question and Answer classes for a survey tool that feeds information into the modeling)
Break
LML Tool Needs
Scalability and collaboration across the lifecycle are essential
LML Tool Needs
• At a high level, we know that any tool must scale to accommodate a very large data set
– We are trying to capture information about the systems of systems being developed, their interactions, decisions, risks, cost, and many other parameters throughout the lifecycle
• Also, collaboration is critical as a large data set means many, many people involved, who use different terminology and perhaps even different languages – worldwide
Fortunately we have a new environment for our LML tools – Cloud Computing
NASA “Requirements”
Tool for Vision for MBSE
• Tool must be designed as an MBSE tool using the Lifecycle Modeling Language (LML)
– All artifacts must be produced from the tool using standard reports
• Tool must enable both seamless data flow and distributed, collaborative teams working on the same model
• Implies need for ability to scale “infinitely” on demand
From a presentation by Dr. Michael Ryschkewitsch, NASA Chief Engineer, at CSER Conference 15 April 2011
Tool for Complexity
• Tool must be designed for use throughout the lifecycle, thus enabling end-to-end systems engineering
– Legacy systems are a key part of any modeling environment – LML has specific information classes for capturing designs at different times in one knowledgebase
• Tool must embed simulation, which enables real-time simulations as well as the ability to scale “infinitely” to hundreds of server instances
• Tool must also reduce complexity by use of special input screens, which enhance configuration management as well
From a presentation by Dr. Michael Ryschkewitsch, NASA Chief Engineer, at CSER Conference 15 April 2011
Tool for Multi-Decadal/Generation
• Tool must enable tracking of performance over time as Characteristics of the Assets
• Environments and conditions can also be captured as they evolve over time and location (including orbital locations)
• Tool must evolve over the lifecycle of systems, but the data must be captured and reused throughout the lifecycle
From a presentation by Dr. Michael Ryschkewitsch, NASA Chief Engineer, at CSER Conference 15 April 2011
Tool for Uses and Challenges
• Tool needs to create a CONOPS report based on scenario modeling
• Capture Issues and Decisions, as well as Cost, for cost, schedule, and performance tradeoffs
• TRLs must be easily captured in the tool
• Software as a service provides an inexpensive tool, while scalability enables the fidelity level required; the LML method enables rapid SE design and analysis
• Tool should automate what can be automated to reduce “wetware” requirements and enhance teaming
From a presentation by Dr. Michael Ryschkewitsch, NASA Chief Engineer, at CSER Conference 15 April 2011
Tool for Uses and Challenges (continued)
• Tool should enable design down to detailed level (if desired), as well as V&V support
• Manufacturing enhanced by linking design to CAD/CAM systems (export design information as required)
• Use of XML files to import/export data from other design tools
• Design rationale, assumptions, and other programmatic information captured as part of Issues, Risks, and other program-related elements of LML
• Standards should be captured and developed using the tool; enforcement by tailoring “personal trainer” version
• Operations data and obsolescence part of LML elements (Assets, Characteristics and Time)
• Use of a scenario-based approach enables better information capture from operators
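The XML import/export bullet above can be sketched as a round trip. The element and attribute names here are invented for illustration, not an actual tool or exchange format:

```python
import xml.etree.ElementTree as ET

# Illustrative round trip for the XML import/export idea above; the
# element and attribute names are invented, not an actual tool format.
def to_xml(entities):
    """Serialize a list of {'class': ..., 'name': ...} entities."""
    root = ET.Element("model")
    for e in entities:
        ET.SubElement(root, "entity", {"class": e["class"], "name": e["name"]})
    return ET.tostring(root, encoding="unicode")

def from_xml(text):
    """Parse the same format back into entity dictionaries."""
    return [{"class": el.get("class"), "name": el.get("name")}
            for el in ET.fromstring(text)]
```

A real exchange would of course carry relationships and attributes too; the point is that a data-centric schema like LML's lends itself to flat, lossless serialization.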
From a presentation by Dr. Michael Ryschkewitsch, NASA Chief Engineer, at CSER Conference 15 April 2011
Tool for Managing Complexity
• Tool, in conjunction with a process, should capture a reasonable number of scenarios for modeling and simulation, using a test matrix approach (move away from “peeling the onion”)
• Use “test matrix” approach to provide breadth of problem space, including failure modes, to capture the full functionality required
• Test must become even more dependent on simulation
From a presentation by Dr. Michael Ryschkewitsch, NASA Chief Engineer, at CSER Conference 15 April 2011
Tool for Managing Complexity
• Tool should enable sharing of models, thus creating a “library” of component models
• Contributions to this library should come from academia (the tool should be free for academic access)
• Government-sponsored research can also be made available to the NASA family via website
From a presentation by Dr. Michael Ryschkewitsch, NASA Chief Engineer, at CSER Conference 15 April 2011
Response to Chief Engineer
From a presentation by Mr. Joe Smith, NASA HQ Program Executive for Systems Engineering, at INCOSE WMA Meeting, 17 September 2014
From a presentation by Mr. Joe Smith, NASA HQ Program Executive for Systems Engineering, at INCOSE WMA Meeting, 17 September 2014
NASA “App Store”
Another DoDAF MetaModel 2.0 (DM2) Physical Exchange Specification (PES)?
From a presentation by Ms. Linda Bromley, Manager, Technical Integration Office at NASA Johnson Space Center, to AIAA
From a presentation by Ms. Linda Bromley, Manager, Technical Integration Office at NASA Johnson Space Center, to AIAA
Organized Functional Tool Requirements (derived from analysis of Mike R’s presentation)
• Models (MBSE)
– Functional
– Physical
– Object
• Simulation
– Discrete event
– Monte Carlo
• Data Sharing
• Collaboration
– Worldwide access to single database
– User interaction (e.g., Chat)
• Scalable to large datasets
• Decision/design traceability
• Cost and Schedule analysis support
• Risk analysis support
• Explicit time evolution analysis
• Full lifecycle support
– Requirements
– Design
– Acquisition
– Verification
– Operation & Support
– Disposal
• Reports from Models
– CONOPS
– DoD or other documents
– Project-specific
• “User Friendly” & Standards
– Modern (web-like) UI
– Desktop Support (PC, Mac, Linux)
– Tablet Support (iPad, Android)
– Enforces standards via rules and checkers
– Embedded training
• “Reasonable” tool cost
IMPLEMENTATION OF LML
Innoslate® is the first tool to implement LML completely
Implementation of LML
• Prior to development of LML, a schema was developed called KBAD and implemented in Vitech’s CORE tool
• We used this for a number of years to get many of the rough edges out of the schema
• As such, LML’s ontology can easily be implemented in a number of tools
• The diagrams, particularly the Action Diagram, must be implemented by tool vendors
• Innoslate® is the first of these LML tools
What is Innoslate?
• A tool to capture requirements, design, operations, and support information using systems engineering principles
• Enables requirements analysis and management with direct linkages to design models
• Applies cloud computing technology to model-based systems engineering techniques and discrete event simulation
• Available on the public cloud, private clouds, and client-server platforms
• Implements the Lifecycle Modeling Language (LML) and enables translation to UML, SysML, DoDAF (DM2), and other languages
Innoslate® provides software as a service (SaaS)
What are the system requirements?
• Operating system
– Any (Windows XP/7, MAC OS X, Linux)
• Devices
– Any (PC, Mac, iPad, Android Tablet)
• Software
– Any modern web browser (Google Chrome, Firefox 5+, Safari; limited support for IE 9+, or IE 7+ with Google Chrome Frame)
• No downloads required
What Can Innoslate® Produce?
• Tool Capabilities
– Early Simulation/Design Validation
– End-to-End Design Management and Collaboration
– Repository of Analyses from Any Tool
– Traceability from Requirements to Functions to Components to Test to Operations to Support to Disposal
• Version 2.4 Reports
– Entity Reports for each class
– Requirements Report
– Concepts of Operations (CONOPS)*
– JCIDS Documents (ICD, CDD, CPD, DCR)*
– DoDAF and MODAF products
– Test and Evaluation Plan/Report
• Reports slated for later versions
– Specifications for detailed design (in the languages of the design engineers)
– Processes and Procedures for Operations and Training

*These complex reports have special input wizards and user guides
Innoslate Supports LML’s Simplified Schema
• Action
• Artifact
• Asset
– Resource
• Characteristic
– Measure
• Connector
• Cost
• Decision
• Input/Output
• Location
– Physical, Orbital, Virtual
• Risk
• Statement
– Requirement
• Time
• Integrated data analysis of cost, schedule, and performance for the lifecycle
• Capture verification and validation data
• Conduct logical decomposition and analysis
• Capture requirements with quality measures
• Capture key stakeholder decisions
• Designed with space in mind
• Simplified schema reduces start-up/training time
• Conduct physical decomposition and analysis
Requirements
Requirements quality checks with definitions
Trace to higher level requirements
Overall quality score based on the summation of individual quality checks
True requirements analysis capability,
not just management
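The "summation of individual quality checks" idea above can be sketched in a few lines. The specific checks here are hypothetical examples of common requirement-quality rules, not Innoslate's actual rule set:

```python
import re

# Hypothetical quality checks; a real tool applies a much larger rule set.
CHECKS = {
    "uses shall": lambda t: "shall" in t.lower(),
    "no vague words": lambda t: not re.search(r"\b(adequate|user-friendly|etc)\b", t, re.I),
    "single sentence": lambda t: t.strip().count(".") <= 1,
}

def quality_score(text):
    """Overall score = summation of the individual checks that pass."""
    return sum(1 for check in CHECKS.values() if check(text))
```

A well-formed requirement such as "The system shall log each request." passes all three checks, while vague or compound statements score lower, giving a simple basis for the overall quality column.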
Requirements View
Color bars to show baselining status
Labels Column
Quality Score Column
Capability to collapse child list
Document based requirements
Add other columns as needed
Enhanced requirements management
capability, too
Automated Quality Check
Database View
Labels, not folders, for organizing information, searching, and providing ready access
Allows multiple labels for the same element!
Share database worldwide with colleagues (Read/Write) and reviewers (Read Only with Free version)
Enable user interaction through built-in Chat
Designed to scale to large data sets
Entity View
Every element can have its own picture, which can be used in the reports
Tabs for logical grouping of relationships reduce information overload
Attribute definitions built into the user interface provide immediate help for new users
Detailed history of changes available for each element
Comments can be added by any user sharing the database, including free users, enabling model-based reviews
Action Modeling (Functional View)
Logic design entities (special cases of actions) can be dragged and dropped on the diagram to model processes or scenarios (new and existing)
Entities can be moved around the diagram as needed
Input/output entities can be added easily by dragging them to the diagram or between actions on the diagram
Action Diagrams for functional modeling designed
to work on tablets, such as the iPad
Side bar provides access to entity attributes for modification within the view
Asset Modeling (Physical View)
Asset Diagrams describe physical components and interfaces
Discrete Event Simulation
Gantt format for time output
View by Dates or Duration
View details and charts for cost analysis
Results saved automatically as an Artifact;
Monte Carlo available in Professional Edition
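The Gantt output and Monte Carlo option rest on a simple idea: execute the action model many times with sampled durations and look at the distribution. A minimal sketch follows, with hypothetical FireSAT-style actions and triangular (min/mode/max) duration estimates; this is not Innoslate's simulation engine.

```python
import random

# Hypothetical actions with (name, min, mode, max) duration estimates in days.
actions = [
    ("Requirements Analysis", 10, 15, 25),
    ("Functional Analysis",   15, 20, 35),
    ("Synthesis",             20, 30, 50),
]

def one_run() -> float:
    """Total duration of one sequential execution of the actions."""
    # random.triangular takes (low, high, mode)
    return sum(random.triangular(lo, hi, mode) for _, lo, mode, hi in actions)

random.seed(42)                       # fixed seed for a repeatable sketch
runs = sorted(one_run() for _ in range(10_000))
print(f"mean = {sum(runs) / len(runs):.1f} days")
print(f"80th percentile = {runs[int(0.8 * len(runs))]:.1f} days")
```

Real action models also branch on decisions and run actions in parallel, which is why simulation rather than simple addition is needed; this sketch keeps the actions sequential to show the Monte Carlo loop itself.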
Location
Use the Select Point button to provide the longitude and latitude of a physical Location using Google Maps
Capture parameters that describe the orbit of space-based assets
Report Generation Wizards
Wizards walk users through documents, guiding their inputs and reducing the need for training
Uses Applied Space Systems Engineering (ASSE) outline; tailorable by using “Add CONOPS Document Section”
Schema Extension
Add new classes, relationships, and attributes as you need them
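One way such an extension mechanism can work is to treat the schema itself as data, so new classes inherit from existing ones without code changes. A minimal sketch: the Satellite class, its attribute names, and the relationship names below are hypothetical, and this is not Innoslate's storage format.

```python
# Schema as data: each class lists its attributes and relationships.
# Base classes loosely follow LML's Action and Asset; the details are invented.
schema = {
    "Action": {"attributes": ["name", "duration"], "relationships": ["performed by"]},
    "Asset":  {"attributes": ["name", "cost"],     "relationships": ["performs"]},
}

def extend_class(base: str, new_class: str, extra_attrs=(), extra_rels=()):
    """Create a new class that inherits the base class's attributes and
    relationships, then adds its own."""
    parent = schema[base]
    schema[new_class] = {
        "attributes": parent["attributes"] + list(extra_attrs),
        "relationships": parent["relationships"] + list(extra_rels),
    }

# Add a hypothetical 'Satellite' class specializing Asset.
extend_class("Asset", "Satellite",
             extra_attrs=["orbit altitude"], extra_rels=["orbited by"])
print(schema["Satellite"]["attributes"])   # ['name', 'cost', 'orbit altitude']
```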
Visual Representations
• Version 2.4 Diagrams
– Hierarchy charts
– Action Diagram
– Asset Diagram (Interfaces)
– N2 Chart (I/O + Actions)
– Risk charts
– Radar Diagram
– Sequence (Experimental) [Object Modeling]
– Spider Diagram (All, Custom, Traceability)
– Timelines (Experimental)
– Location (maps)
– IDEF0/ICOM
• Planned for future releases
– Other location views, including orbital
– Class, and other UML/SysML diagrams [Object Modeling]
For Risk Analysis Support
For Decision/Design Traceability
For additional time evolution analysis and presentation
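Among the listed views, the N2 chart is easy to derive mechanically from the model: actions sit on the diagonal, and each input/output entity lands at the row of its producing action and the column of its consuming action. A sketch with hypothetical FireSAT flows:

```python
# Hypothetical FireSAT flows: (producing action, consuming action, I/O entity).
flows = [
    ("Detect Fire",   "Notify Ground", "fire alert"),
    ("Notify Ground", "Dispatch Crew", "fire location"),
    ("Dispatch Crew", "Detect Fire",   "status report"),
]
actions = ["Detect Fire", "Notify Ground", "Dispatch Crew"]
idx = {a: i for i, a in enumerate(actions)}

n = len(actions)
grid = [["" for _ in range(n)] for _ in range(n)]
for i, a in enumerate(actions):
    grid[i][i] = a                        # diagonal holds the actions
for src, dst, item in flows:
    grid[idx[src]][idx[dst]] = item       # off-diagonal holds the I/O

for row in grid:
    print(" | ".join(f"{cell:15}" for cell in row))
```

Reading a row shows everything an action produces; reading a column shows everything it consumes, which is what makes the N2 format useful for spotting missing or dangling interfaces.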
Innoslate Features and Benefits
Feature | Benefit
Chat/Real-time Notification | Enables collaboration worldwide
Lifecycle Modeling Language (LML) schema and Web User Interface | Provides simplicity and ease of use with little or no training
Open feature requests with voting on priority by users (Feature Tracker) | Supports transparency between users and developers
Google App Engine's cloud computing capability | Can scale design to deal with very large projects that include millions of objects
Innoslate® security layer with Google App Engine security layer and SSL | Provides secure environment for data at rest, or use Client-Server version
Share models and parsed documents via Sharespace | Reduces rework of commonly used documents and models
Automatic upgrades; no installation; runs on any computer or device (e.g., iPad) | Reduces IT support costs and trouble significantly
Responsive to user requests | Provides new features for users when they need them
Monthly payment plan | Buy what you need, when you need it
For a Complete Comparison
• Tools that Innoslate can replace
– Requirements Management (e.g., DOORS)
– Modeling (e.g., CRADLE, CORE)
– Simulation (e.g., CORESim, ARENA)
– File Sharing (e.g., Sharepoint, Windchill)
– Risk Analysis and Management
– Project Planning (e.g., MS Project)
• Be sure to compare combined cost, as well as features
Break
Sample Problem
Let's look at an example of using Innoslate® through the lifecycle
To Further Our Understanding
• We will use a sample problem called FireSAT
– "FireSAT is arguably the most famous space mission that never flew. The idea for this fictitious mission was first developed in Space Mission Analysis and Design (SMAD) [Larson and Wertz, 1999]."
– See ASSE Section 1.6 and Chapter 9 for more details
• Task for the ASSE book: “Develop the mission concept for an on-orbit fire detection system.”
• As we go through the lifecycle, we will capture data entities using this problem
• Since we may not have subject matter experts on this topic, this mirrors a real-world occurrence: we might have to build a team to perform this analysis
What name do we want to give this service?
What is Middle-Out? It Begins with the Lifecycle
Architecture Development
System Design
Hardware/Software Acquisition
Integration and Test
Operational T&E and Transition
Future Operations and Support
Demolition and Disposal
Program Management
Current Operations and Support
Requirements Analysis
Functional Analysis and Allocation
Synthesis
System Analysis and Control
System Design Process
Requirements Analysis
Functional Analysis and Allocation
Synthesis
System Analysis and Control
Best Use: "Classical SE"
Best Use: Reverse Engineering (As-Is)
Best Use: Architecture Development (To-Be)
Adapted from EIA-632
The middle-out approach has been proven on a variety of projects
System Design Process Timeline
1. Capture and Analyze Related Artifacts
2. Identify Assumptions
3. Identify Existing/Planned Systems
4. Capture Constraints
5. Develop the Operational Context Diagram
6. Develop Operational Scenarios
7. Derive Functional Behavior
8. Derive Assets
9. Allocate Actions to Assets
10. Prepare Interface Diagrams
11. Define Resources, Error Detection & Recovery
12. Perform Dynamic Analysis
13. Develop Operational Demonstration Master Plan
14. Provide Options
15. Conduct Trade-off Analyses
16. Generate Operational and System Architecture Graphics, Briefings and Reports
(The steps unfold over time, spanning the Requirements Analysis, Functional Analysis, Synthesis, and System Analysis and Control phases.)
This implementation of the middle-out approach has been proven on a variety of architecture projects
Requirements Analysis
Requirements Analysis
• Step 1: Import “requirements” (ASSE goals and objectives)
• Step 2: Identify any needed critical decisions (issues)
[Flow diagram: Source Documents, the External Interface Database, User Needs, Standards, and Selected Change Requests feed Decompose Statements. Each statement is checked as a critical issue (if yes, determine options and perform trade studies, then resolve issues with the customer; see System Analysis and Control for details) and for verifiability (if not verifiable, coordinate changes to make the statement verifiable). Statements and risks are reviewed with the customer, risks are identified and mitigation planned, and the Architecture Knowledgebase is updated, producing an Updated Requirements Traceability Matrix and Preliminary Test Requirements.]
Requirements Analysis
• Step 3: Analyze requirements quality (including verification)
• Step 4: Identify any risks
[The same Requirements Analysis flow diagram appears on this slide.]
Functional Analysis
Functional Analysis
• Step 1: Create Context Diagram
• Step 2: Develop Scenarios
• Step 3: Build scenario model
[Flow diagram: starting from the Architecture Knowledgebase, develop/revise the context diagram, develop a series of scenarios for analysis, and create/update the system behavior model (control flow, data flow/activity model, performance criteria). Analyze the behavior model's performance, determine options and perform trade studies (see System Analysis and Control for details), allocate actions to assets and input/outputs to links, identify risks and plan mitigation, and review the model and risks with the customer. Outputs: an updated Architecture Knowledgebase, a Detailed Operational Concept, and an Operational Requirements Document (ORD).]
Functional Analysis
• Step 4: Execute scenario
• Step 5: Identify risks
• Step 6: Allocate Actions to initial Assets
[The same Functional Analysis flow diagram appears on this slide.]
Synthesizing Solutions
Synthesis
• Step 1: Identify Component Assets
• Step 2: Explore options
• Step 3: Allocate Component Assets to Actions
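Step 3 above is, at its core, a mapping from Actions to the Assets that perform them (LML's performs/performed by relationship). A sketch with hypothetical FireSAT actions and assets, including the kind of completeness check that flags unallocated actions:

```python
# Hypothetical FireSAT actions and component assets; the allocation is a
# simple action -> asset mapping (one performing asset per action here).
allocation = {}

def allocate(action: str, asset: str) -> None:
    allocation[action] = asset

allocate("Detect Wildfire",        "IR Sensor Payload")
allocate("Downlink Alert",         "Communications Subsystem")
allocate("Process Detection Data", "Onboard Computer")

# Completeness check: every action in the behavior model should be
# performed by some asset before synthesis is done.
all_actions = ("Detect Wildfire", "Downlink Alert",
               "Process Detection Data", "Geolocate Fire")
unallocated = [a for a in all_actions if a not in allocation]
print("Unallocated actions:", unallocated)
```

In a full model an action may be allocated to more than one candidate asset during trade studies; the mapping would then hold sets of assets rather than a single one.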
[Flow diagram: from the Architecture Knowledgebase, identify component assets; if optional assets exist, determine options and perform trade studies (see System Analysis and Control for details) and select new assets in coordination with the customer. Allocate assets (see Functional Analysis and Control for details), identify risks and plan mitigation, and review the design and risks with the customer. Outputs: technology insertion recommendations, an updated Architecture Knowledgebase, a Functional Requirements Document (FRD), design diagrams, a transition plan, and a programmatic roadmap with links to programs.]
Verification
Still under development
Coming Up the Vee
I&V Planning
Integration
Verification
Design Monitoring
Verification
• Step 1: Create Test Plan
• Step 2: Create Test Case
• Step 3: Create Verification Requirement
• Step 4: Allocate Verification Requirement to Requirement
• Step 5: Identify MOEs and MOPs
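Steps 3 and 4 amount to building traceability rows: each verification requirement verifies a requirement and is exercised by a test case. A sketch of an RVTM built from such triples; VT.1 and SRD.3.2.2 come from the FireSAT example in these slides, while the other identifiers are hypothetical.

```python
# (verification requirement, verified requirement, test case) triples.
# SRD.3.2.2 and VT.1 appear in the FireSAT example; the rest are invented.
verifies = [
    ("VR.1 Spectral Resolution Check", "SRD.3.2.2 Spectral Resolution", "VT.1"),
    ("VR.2 Detection Time Check",      "SRD.3.2.3 Detection Time",      "VT.1"),
]

def rvtm(rows):
    """Requirements Verification Traceability Matrix as formatted lines."""
    header = f"{'Requirement':35} {'Verification Req.':32} {'Test Case'}"
    body = [f"{req:35} {vr:32} {tc}" for vr, req, tc in rows]
    return [header] + body

for line in rvtm(verifies):
    print(line)
```

Because the rows are just relationships in the model, the same data can be pivoted the other way to answer coverage questions, e.g., which requirements have no verification requirement yet.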
1. Introduction
   1.1 Purpose
   1.2 Mission Description
   1.3 Architecture Overview
2. Test Strategy
   2.1 Test Readiness Review
   2.2 Test Verification Matrix
   2.3 Testing Environment
   2.4 Testing Approach
   2.5 Test Data
   2.6 Test Equipment and Facilities
   2.7 Configuration Management
   2.8 Test Cases
3. Test Management, Schedule, and Processes
   3.1 Roles, Responsibilities, and Resources
   3.2 Testing Schedule
   3.3 Test Processes
   3.4 Metrics and Reporting
      3.4.1 Test Case Approval Process
      3.4.2 Test Deliverables and Post Test Activities
A. Appendixes
   A.1 Requirements Verification Matrix (RVTM)
Test Plan
• Outline generated by wizard as statement entities
• Wizard can be used to build test cases and associate them with the test plan
Test Cases
Test Case Number: VT.1
Test Title: FireSAT Developmental Test
Tester Name: V&V Organization
Pass/Fail: Fail
Revision Number:
Test Proctor: NASA
Test Date: 01 December 2014
Estimated Execution Time: 10.0 hours
Actual Execution Time:
Approval Signature:
Approval Date:
Test Objective: Demonstrate that FireSAT system meets technical objectives.
Test Setup: Laboratory environment
Related Requirements or Configuration Change Requests (Number/Description):

No. | Test Action | Expected Results | Pass/Fail | Actual Results / Comments | CR/PR/DR No.
1 | | | | |
2 | | | | |
3 | | | | |
4 | Setup of equipment includes stimuli to emulate a fire from space. | Fire detection within objective measures will occur. | Pass | Test met 90% of objective value, well below the threshold value. |
5 | | | | |

Test Cases become part of the test plan or can be developed separately
Test Verification Matrix (TVM)
Test Case: VT.1 FireSAT Developmental Test
Verification Requirement Number and Name: SRD.3.2.2 Spectral Resolution
Verification Requirement Description: The space vehicle shall detect wildfire emissions in the wavelength range of 4.2 plus or minus 0.2 micrometers. Rationale: This requirement comes from the FireSAT wildfire definition trade study and corresponds to a wildfire temperature of 690 K. The optimum wavelength for a fire at 1000 K would be 2.9 micrometers; however, there is no atmospheric window at that wavelength, so 4.2 micrometers represents a good compromise.
Criteria: Meets detection temperatures below threshold values.
TABLE: TEST VERIFICATION MATRIX (TVM)
The TVM shows the linkage between the test cases and the verification requirements
Project Management
Innoslate® Support for Project Management
• Process Modeling (Action Diagram)
• Tracking (Action, Cost, and Decision classes; simulation)
• Risk Analysis (Risk class and diagram)
• Timeline chart
Creating Reports
Reports in Innoslate®
• Reports for every class
• General reports
– Comments, Entity Table, etc.
• Complex reports with wizards
– CONOPS, Requirements Document
• Specialty reports
– JCIDS, DoDAF, MODAF
Summary
LML Bottom-Line
• LML provides a simplified ontology that should meet the needs of most projects with little or no extension
• The cloud provides a new capability to deal with complex problems
• Innoslate® was designed to implement LML and is not expected to be the only tool that does so