Unclassified Distribution A: Approved for public release; distribution is unlimited.
Software Challenges and Keys to Success
Sept 2016
Joe Heil
Lead of the Naval Software Community of Practice (SW COP)
Lead of the Naval Syscom System Engineering Stakeholders Group Software Working Group (SESG SW-WG)
Chief Engineer and Principal Software Engineer for the Strategic and Computing Systems Dept
Naval Surface Warfare Center Dahlgren Division (NSWCDD)
540-653-1937
joe.heil@navy.mil
Presenter Bio & Presentation Sources
Presenter: Joe Heil
Over 30 years of applied Naval Warfare Systems Software Development and Leadership
20 years as a SW engineer for the Tactical Tomahawk Missile Weapon Control System (TTWCS)
− Software Developer, Group Lead, Branch Head, and Software Integrated Product Team (IPT) Lead
Current software leadership responsibilities:
− Chief & Principal Software Engineer for the NSWCDD* Strategic and Computing Systems Dept
− Lead: Naval Software Community of Practice (SW COP)
− Lead: Naval System Engineering Stakeholder Group (SESG) Software Working Group (SW-WG)
Primary Goal: Improving naval software system acquisition and engineering success via increased awareness and application of best software engineering practices
Presentation Information Sources
First-hand experience leading complex software intensive warfare system development efforts
Over the last few years: chaired or served as panel member for 300+ Warfare System Project Reviews
− Wide range of development approaches (Strategic, Waterfall, Agile, Rapid, Prototyping, etc.)
− Wide range of Warfare Systems (Missiles, Guns, Lasers, Directed Energy, Non-lethal, Simulations, etc.)
Numerous Studies & Reports
− Defense Science Board, Government Accountability Office, DASN/RDTE, Crosstalk, etc.
* NSWCDD: Naval Surface Warfare Center, Dahlgren Division
Presentation Objectives and Content
OBJECTIVES: Provide awareness of:
− Common software system acquisition and engineering challenges
− Proven techniques for software system acquisition and engineering success
− Naval Software Improvement and Collaboration Efforts
CONTENT:
− Context information
− Challenges
− Keys to Success
− Improvement Interactions
− Summary
Context: NAVSEA Warfare Centers
Naval Surface Warfare Center Dahlgren Division (NSWCDD)
NSWCDD provides full spectrum system engineering & development for Naval warfare systems. This includes hands-on development of system and software requirements, architecture, design, and code; as well as system integration, test, and operational support responsibilities.
[Map: NAVSEA Warfare Center sites]
NSWC Headquarters (Washington, DC)
NSWC Dahlgren (Dahlgren, VA)
Combat Direction Systems Activity (Dam Neck, VA)
NSWC Carderock (West Bethesda, MD)
NSWC Indian Head (Indian Head, MD)
EOD Technical Division (Indian Head, MD)
NSWC Crane (Crane, IN)
NSWC Corona (Norco, CA)
NSWC Port Hueneme (Port Hueneme, CA)
NSWC Panama City (Panama City, FL)
Naval Ship Systems Engineering Station (Philadelphia, PA)
NUWC Newport (Newport, RI)
NUWC Keyport (Keyport, WA)
Context: Naval Surface Warfare Center Dahlgren Division (NSWCDD) Software Intensive Warfare System Development and Success
Demonstrated Complex Mission-Critical Software Engineering Success
- Operational Success: Thousands of successful Tomahawk strikes and Battle Management System precision strikes
- Operational Success: Detect-track-engage systems (Gunslinger, Wolfpack, Battle Management System, etc.)
- Rapid technology transfer: Deployed Laser Weapon System Quick Reaction Capability (LWC-QRC)
- Decades worth of successful ballistic missile test launches
- Award winning (Al Gore Hammer Award) modeling and simulations
- High Quality Software: Defect ratios consistently less than 1 defect per KSLOC (thousand lines of code)
- Open Architected Systems: Common, scalable, reliable, multi-platform capable software architectures
Development spans rapid, tactical, and strategic efforts. Sample products*: Tomahawk Weapon Control System; Submarine Ballistic Missile Mission Planning and Fire Control Systems; Detect-Track-Engage Systems (Lasers, Guns, Non-Lethal); Surface Warfare Mission Module (Littoral Combat Ship); Anti-Submarine Warfare System; others
Also develops significant simulations and models: Ship Motion, Missile Flight, System Interfaces, Modeling/Simulation Framework
* Not the complete set of NSWCDD software systems and products; just a small sample
Software Intensive Warfare Systems: Challenges
Increasing software size, reliance, complexity, and cyber threats drive cost, schedule, & technical challenges*
* Verified in documented performance result reports of DOD software system acquisition
Challenges include ever increasing:
- Demand for faster and cheaper development and delivery of systems to meet emergent needs & threats
- Pressure to “cut corners” and not utilize rigorous and disciplined data-driven best-practices/processes
- Cyber warfare threats and associated Information and Software Assurance requirements
- System and System-of-System (SOS) complexity and inter-dependencies
- Rapid evolution of software technologies and methodologies
Primary Goal
Consistently deliver high quality and reliable software systems as efficiently as possible
[Chart: over time, software size, reliance, complexity, and cyber threats increase while government in-house applied software expertise declines]
Software Challenges: Project Execution
Limited use of mature data-driven best-practice based software project execution and continuous improvement
Poor software effort estimation and tracking processes (non metrics based)
Subjective assessment of cost, schedule, quality, and security versus metrics-based assessment
Program leadership with limited applied SW expertise; software treated as a "black box"
Poor communication
Poor requirements management; significant requirements volatility
Software architects not included in early system engineering phases
Poor software architecture: not enough time spent on the initial arch/design; jump-into-coding
Misuse of software prototyping (failure to throw away or formalize prototype code)
Lack of formalized risk management processes
Limited utilization of government in-house applied software expertise; over-reliance on industry
Not a complete set, but includes some of the common major contributors to cost, schedule, and technical failures
Based on numerous DOD/DON studies/reports and first-hand experience (300+ project reviews over the last few years)
Software Challenges: Cyber Threats
[Timeline: Threat Created → Threat Deployed → Threat Discovered → Threat Disclosed → Correction Available → Correction Deployed]
System Vulnerability Window: Months to Years
Very expensive to fix vulnerabilities after deployment
Cyber vulnerabilities are increasing
[System Engineering "V" diagram: Mission & Stakeholder Analysis; Concept Generation; Requirements Definition; System Architecture / Interface Definition; Subsystem/Interface Design; Build and Unit Test (Hardware/Software); Subsystem Integration/Test; System Integration & Test; Verification, Validation, & Certification; Deploy/Install; Train, Operate, & Support; Decommission & Dispose. Spanned by System Analysis & Control (user centered, risk focused, and total-cost based engineering; system analysis and assessment via modeling & simulation; configuration management) and Specialty Engineering (Anti-Tamper, Environmental, Human Systems Integration, Information Assurance, Logistics, Safety, Training, etc.)]
Cannot "patch-in" cyber security and resiliency; resiliency is not designed-in
Current "find-fix" cyber mitigation approach is reactive, slow, and costly
* This graph is dated, but it is evident that cyber threats are still increasing
Reactive cyber threat mitigation process:
- Focus on Information Assurance (IA) versus Software Assurance (Quality, Security, and Resiliency)
- Numerous disjointed and non-coordinated “Cyber/IA/IT” organizations; proliferation of policy
- Numerous cyber-tools; challenges to acquire/fund/train/integrate tools into development processes
- Software security and resiliency were not designed-into legacy systems / cannot afford to re-architect
- RDTE systems treated the same as business IT systems; cumbersome certification req's limit flexibility and responsiveness
Keys to Software Engineering Success*
Mature Documented Data-Driven Processes
Work-Unit Based Software Effort Estimation and Tracking
Open Architecture
Agile/Rapid Development (Build-a-Little, Test-a-Little)
Simulations and Data Extraction
Government and Industry Software-Development Teams
Engineer-In Software Assurance (Security, Quality and Resiliency)
Communication & Collaboration
* Not the complete set of best practices; just a selected subset of proven techniques
Keys to Software Success: Mature, Documented, and Measured Processes
Capability Maturity Model Integrated (CMMI): a framework for continuous improvement. Risk decreases from HIGH to LOW as maturity increases:
1. INITIAL: ad hoc; chaotic
2. MANAGED: disciplined
3. DEFINED: consistent process (institutionalized)
4. QUANTITATIVELY MANAGED: predictable process (metrics)
5. OPTIMIZING: continuously improving (metrics driven)
METRICS DRIVEN PROCESSES: applied technical expertise, technical processes, and management processes yield high quality products, delivered on time & within budget, with continuous data-driven improvement
Technical Expertise coupled with Data Driven Processes facilitates consistent delivery of
high quality software systems on schedule and within budget
Data-Driven Continuous Improvement cycle (applies to ALL efforts):
1. Define / Refine Process & Metrics
2. Estimate Cost, Schedule, Quality
3. Track Actual Cost, Schedule, Quality
4. Analyze Metrics
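To make the loop concrete, here is a minimal sketch (in Python) of steps 2 through 4 and the feedback of actuals into the next estimate; the productivity factor and hours are invented for illustration, not historical data:

```python
# Minimal sketch of the Define/Estimate/Track/Analyze loop, assuming a
# work-unit productivity model; all names and numbers are illustrative.

def estimate_hours(work_units: float, hours_per_unit: float) -> float:
    """Step 2: estimate effort from planned work units and productivity."""
    return work_units * hours_per_unit

def variance(estimated: float, actual: float) -> float:
    """Step 4: variance as a fraction of the estimate (positive = overrun)."""
    return (actual - estimated) / estimated

# Step 1: process & metrics are defined elsewhere; start from a
# hypothetical historical productivity factor.
hours_per_req = 12.0
planned = estimate_hours(work_units=40, hours_per_unit=hours_per_req)

# Step 3: track actuals as the increment completes (hypothetical value).
actual = 550.0

print(f"planned {planned:.0f} h, actual {actual:.0f} h, "
      f"variance {variance(planned, actual):+.1%}")

# Close the loop: feed actual productivity into the next estimate.
hours_per_req = actual / 40
```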
Keys to Project Execution Success: Best-Practice-Based System Engineering Processes
The Business Case for Systems Engineering Study:
Results of the Systems Engineering Effectiveness Survey
SPECIAL REPORT CMU/SEI-2012-SR-009, November 2012
[Charts: project performance versus systems engineering capability applied, plotted separately for low-complexity and high-complexity projects; each shows a value proposition for applying SE capability]
Application of SE Best Practices: of the projects deploying the most SE best practices, over 50% delivered higher project performance.
Best Processes:
Project Planning
Project Monitoring and Control
Risk Management
Integrated Product Teams (IPT)
Configuration Management
Requirements Development and Management
Product Architecture
Trade Studies
Product Integration
Verification
Validation
System Engineering includes cost, schedule, & technical performance.
System Engineering best practices improve project execution.
System Engineering best practices include Project Planning & Control.
Applied Technical Expertise + Data-Driven Management Best Practices + Data-Driven Technical Best Practices → Consistent Success
Metrics Driven Project Execution
GOAL: Deliver High Quality Products on Schedule and Within Budget
Question → Metric Examples (not the complete set)
Is the scope of the effort understood? → Requirements impacted (new, modified, deleted); estimated hours/dollars per activity (SE, SW, Test, etc.)
Are the requirements understood, documented, allocated, and stable? → Requirements volatility over time; requirements allocation to organizations and engineering/test activities
Is the project adequately staffed? → Staffing profiles (by discipline and organization); skill sets required versus on-board and available
Is the project on schedule? → Integrated Master Schedule and detailed schedules; planned vs. actual progress with variance explanations
Is the project on budget? → Planned vs. actual cost with variance explanations
Is the project meeting quality goals? → Defect detection and closure trends; defect ratios; requirements and tests passed vs. failed
Is the project FORMALLY managing risks? → Open versus closed risk trends
Is the project continuously improving? → Cost, schedule, and quality variance reductions
Metrics should be:
• Utilized in all activities, and by all organizations
• Proactively and regularly utilized (not just at milestone reviews)
• Documented, well defined, value added, and easily understood
• Supported by a software-level Work Breakdown Structure (WBS)
• Capable of being rolled up to higher levels of abstraction
• Continuously assessed and improved
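As a hedged illustration of the quality metrics above, the sketch below computes defect ratios and test pass rates per component and rolls them up one WBS level; the component names and counts are made up:

```python
# Hypothetical sketch: per-component quality metrics rolled up a
# software-level WBS. Data is invented for illustration.

components = {
    # component: (defects_found, ksloc, tests_passed, tests_failed)
    "track_manager":  (14, 22.0, 310, 12),
    "engage_planner": ( 6,  9.5, 140,  3),
}

def defect_ratio(defects: int, ksloc: float) -> float:
    """Defects per thousand source lines of code (goal: < 1 per KSLOC)."""
    return defects / ksloc

for name, (defects, ksloc, passed, failed) in components.items():
    print(f"{name}: {defect_ratio(defects, ksloc):.2f} defects/KSLOC, "
          f"{passed / (passed + failed):.1%} tests passed")

# Roll-up to the next WBS level: divide the totals, don't average ratios.
total_defects = sum(c[0] for c in components.values())
total_ksloc   = sum(c[1] for c in components.values())
print(f"build: {defect_ratio(total_defects, total_ksloc):.2f} defects/KSLOC")
```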
SW Effort Estimation: Problem with SLOC-Based Planning and Tracking
System development activities and artifacts:
- System Engineering Products: Concept of Operations; System Level Req's; Architecture Doc's & Diagrams; Interface Specifications; H/W Drawings & Specifications
- Software Products: Requirements; Architecture; Design; Source Code; Test Procedures / Reports
- Integration & Test Products: Integration Plans; Test Plans; Test Procedures; Test Reports
- Activities: System Engineering; SW Preliminary Design; SW Detailed Design; SW Coding; SW Testing; System Integration; System Testing; Operational Test
SLOC based planning and tracking relies on 1 single activity and 1 artifact.
The SLOC based estimation approach relies on several unrealistic assumptions:
− Software engineers can accurately estimate efforts at the level of hundreds-of-thousands to millions of SLOC
− SLOC based productivity factors (SLOC per hour) are based on accurate & relevant historical data
− SLOC productivity is indicative of other engineering activity productivity (Req's, Design, Test, etc.)
− Constant effort relationship between SW activities and other engineering activities
Software Success: Work-Unit Based Effort Estimation and Tracking
For each software component and each SW development activity (Requirements, Design, Code, Test):
1. DEFINE WORK-UNITS (WU) and define productivity (P) as hours per work unit
2. ESTIMATE WORK UNITS: utilize historical data from previous similar efforts
3. CALCULATE EFFORT (hours):
   Development Activity Hours = Estimated Work-Units * Productivity Factor
   SW Component Hours = sum of ALL development activity hours
   Incremental Build Hours = sum of ALL SW component hours
4. TRACK PROGRESS: track work units produced and actual productivity
5. IMPROVE ESTIMATION: compare estimates to actuals; revise work units and productivity factors as required
[System Engineering "V" diagram, as shown earlier, from Mission & Stakeholder Analysis through Decommission & Dispose]
Define one or more "Work Units" per SE activity, and define an associated productivity factor per each Work Unit.
Work-Unit Based Estimation & Tracking examples:
REQ's: Hours per Requirement, Hours per Interface, etc.
DESIGN: Hours per Object, Hours per DODAF View, etc.
CODE: Hours per Object, SLOC per Hour
TEST: Hours per Test Development, Hours per Test Execution
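A minimal sketch of the calculation defined above (activity hours, summed to component hours, summed to build hours); the work-unit counts and productivity factors are illustrative, not calibrated historical data:

```python
# Work-unit based effort estimation, per the roll-up rules above.
# All numbers are hypothetical.

# activity: (estimated_work_units, hours_per_work_unit)
component_plan = {
    "requirements": (35, 10.0),   # hours per requirement
    "design":       (20, 16.0),   # hours per object
    "code":         (20, 24.0),   # hours per object
    "test":         (45,  6.0),   # hours per test developed/executed
}

def activity_hours(work_units: float, productivity: float) -> float:
    """Development Activity Hours = Estimated Work-Units * Productivity Factor."""
    return work_units * productivity

def component_hours(plan: dict) -> float:
    """SW Component Hours = sum of ALL development activity hours."""
    return sum(activity_hours(wu, p) for wu, p in plan.values())

# Incremental Build Hours = sum of ALL SW component hours.
build_plans = [component_plan]            # one component in this sketch
build_hours = sum(component_hours(p) for p in build_plans)
print(f"estimated build effort: {build_hours:.0f} hours")
```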
Modular Open Systems Architecture Common Design Approach
[Layered architecture diagram:
- Presentation Layer: COMMON display components, plus Platform X and Platform Y unique components
- Applications Layer: COMMON application components; Sensor Class, Weapons Class, and Launcher Class common components; unique components for Sensor 1..N, Weapon 1..N, and Launcher 1..N
- Infrastructure Layer: common utilities components
- Virtualization Layer: independent of processor or operating system
Key: government owned/developed components versus privately (industry) developed components, connected via standard interfaces]
OA: A system composed primarily of common software that can be utilized across a wide range of platforms with minimal changes
The government develops and owns the core architecture, interfaces, and software (the Core Common Framework); industry develops the "plug and play" sensor, launcher, and weapon components.
Aligns with ASN RD&A Memo to the SYSCOMs/Chief of Naval Research, "Use of In-House Engineering and Technical Requirements," 23 Feb 2012
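To make the "plug and play" idea concrete, here is a small sketch of a government-owned standard interface with interchangeable industry components; the Sensor protocol and vendor classes are invented for illustration and are not taken from any actual Navy framework:

```python
# Sketch: a standard interface (government owned) that industry-built
# components plug into. Names and message fields are hypothetical.

from typing import Protocol

class Sensor(Protocol):
    """The 'Standard I/F': any conforming sensor component plugs in."""
    def detect(self) -> list[dict]: ...

class VendorASensor:
    """Industry-developed component conforming to the standard interface."""
    def detect(self) -> list[dict]:
        return [{"track_id": 1, "bearing_deg": 42.0}]

class VendorBSensor:
    def detect(self) -> list[dict]:
        return [{"track_id": 7, "bearing_deg": 180.5}]

def fuse_tracks(sensors: list[Sensor]) -> list[dict]:
    """Core common framework code: unchanged when vendors are swapped."""
    return [track for s in sensors for track in s.detect()]

# Swapping or adding vendors requires no change to the framework code.
print(fuse_tracks([VendorASensor(), VendorBSensor()]))
```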
Software Development Best Practices
ACTIVITIES:
- Mission Level Req's, Mission Level Architecture, Mission Level Interfaces (early mission level validation: prototypes, models, simulations)
- System Level Req's, System Level Architecture, System Level Interfaces (early system level testing: prototypes, models, simulations, frameworks, ...)
- Software Level Req's, Software Level Architecture, Software Level Interfaces (early software component level testing: representative (if possible) HW and SW components; utilize simulations and I/F drivers (GO AND FAULT PATHS); cyber security and system resiliency testing)
- Software Components: sprints / incremental builds (Build-A-Little, Test-A-Little); software prototypes
- Integrated System Engineering Builds and Integrated System Test Builds
- Software / system integration testing: actual and representative HW and SW components, augmented by simulations and I/F drivers; cyber security, penetration testing, and system resiliency testing
- Integrated System: platform-based system testing with actual hardware and software
- Operational Support
Early & often verification and validation: prototyping; models and simulations; automated testing (in early activities); data extraction & analysis tools
Adherence to data-driven best practices
Prototypes must be formalized: go through a disciplined process
Program Plan must include the post-delivery support approach: organizational responsibilities, funding, problem tracking
Well defined SW Support Activity; quick recovery from exploited threats
Design-in software security and resiliency
User-centered and risk-focused system/software engineering
Complete requirements traceability and configuration management
Formal risk management
Utilize cyber security tools
Defense-In-Depth: Protect-Detect-Isolate-Endure-Recover
- Limit interfaces to external systems
- Identify and harden (firewall) critical control points (interfaces)
- Add processing to detect cyber intrusions
- Isolate and limit consequences (protect mission critical components)
- Endure through the cyber intrusion to successfully complete the mission
- Return the system to a trusted state
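A minimal sketch of the Detect and Isolate steps at one hardened interface control point; the message fields, bounds, and quarantine policy are hypothetical:

```python
# Sketch: validate inbound interface traffic, quarantine anomalies
# (Detect/Isolate), and keep servicing good messages (Endure).

EXPECTED_FIELDS = {"seq", "track_id", "bearing_deg"}

def detect_anomaly(msg: dict, last_seq: int) -> str | None:
    """Return a reason string if the message looks malformed or hostile."""
    if set(msg) != EXPECTED_FIELDS:
        return "unexpected fields"
    if msg["seq"] <= last_seq:
        return "out-of-sequence"
    if not 0.0 <= msg["bearing_deg"] < 360.0:
        return "out-of-bounds value"
    return None

def process(stream: list[dict]) -> None:
    last_seq, quarantined = 0, []
    for msg in stream:
        reason = detect_anomaly(msg, last_seq)
        if reason:                  # Isolate: log and drop, don't propagate
            quarantined.append((reason, msg))
            continue                # Endure: keep servicing good traffic
        last_seq = msg["seq"]       # normal mission processing would go here
    print(f"quarantined {len(quarantined)} message(s)")

process([{"seq": 1, "track_id": 4, "bearing_deg": 90.0},
         {"seq": 1, "track_id": 4, "bearing_deg": 400.0}])
```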
Rapid Software Development Goals
Get Capability to Warfighter More Quickly
Traditional/Waterfall: "big bang" delivery. Agile: smaller / faster deliveries.
Short-term releasability: providing an early version of the software with a subset of its ultimate functionality
Requirements flexibility: the ability to quickly change requirements while software development is in progress
NOTE: Agile development may not be the appropriate approach for all projects
Program Leaders must assess if Agile is the best approach for their specific program needs and structure
Rapid Software Development
Image available at www.mountaingoatsoftware.com/scrum
[Scrum diagram (Sprint 0, Sprint 1, Sprint 2, ... Sprint x, then DELIVERY): each sprint cycles through REQ → Design → Code → Test]
GOAL: Each sprint is working software, subset of capabilities, & capable of being delivered
Requires frequent regular communication between user, sponsor, & developers
Rapid/Agile development is NOT an excuse to not have:
− Documented data-driven processes
− Cost, schedule, and technical performance measures
− Requirement, architecture, design, and test documentation
Software Success: Simulations and Data Extraction
[Diagram: tactical CSCIs surrounded by simulations: missile sim, launcher sim, tracks simulation, navigation simulation, other CSCI simulations]
Simulations:
- Must support both "go path" and "fault path" scenarios (fault paths: send data out of sequence, out of bounds, at high rates, etc.)
- Must include the effort to develop/modify simulations in cost and schedule planning
- Developed using disciplined processes, and ideally by an independent team
Data extraction: all interface data and critical internal states/data are captured to a data extraction file and analyzed by a data reduction program
[Diagram key: simulations (comm's, launcher, missile, and other CSCI simulations) surround the Tactical Computer Software Configuration Items (CSCIs), which feed the data extraction path]
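A small sketch of a simulation driver that exercises both go-path and fault-path scenarios; send_to_csci() is a hypothetical stand-in for the real interface of the CSCI under test:

```python
# Sketch: inject nominal and fault-path messages into a CSCI under test.
# Message shapes are invented for illustration.

import random

def go_path_messages(n: int):
    """Nominal traffic: in-sequence, in-bounds status messages."""
    for seq in range(1, n + 1):
        yield {"seq": seq, "status": "READY", "rail_angle": 45.0}

def fault_path_messages(n: int):
    """Fault injection: out-of-sequence, out-of-bounds, bad enumerations."""
    for _ in range(n):
        yield random.choice([
            {"seq": 0,  "status": "READY", "rail_angle": 45.0},    # bad seq
            {"seq": 99, "status": "READY", "rail_angle": -720.0},  # bad value
            {"seq": 99, "status": "???",   "rail_angle": 45.0},    # bad enum
        ])

def send_to_csci(msg: dict) -> None:
    print("->", msg)   # placeholder for the real interface under test

for msg in go_path_messages(3):
    send_to_csci(msg)
for msg in fault_path_messages(3):
    send_to_csci(msg)
```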
Software Success
Teaming: Industry & Government in-house expertise
[Diagram: government in-house applied expertise pipeline. With time and experience, system development responsibility & complexity grow from the system sub-component level, to the system component level, to the system level, to the mission and systems-of-systems level; hands-on applied expertise is maintained at all levels of complexity while teaming with industry]
Many DOD/DON Program Managers have limited applied software expertise.
Must have some government engineers responsible for actual development (not just contractor oversight):
- Provides the Program Manager with business and technical advantages
- Facilitates controlling cost: the government is not solely reliant on industry expertise
- Provides industry with a true technical peer to help negotiate cost, schedule, and technical approach
Hands-On Development
Government hands-on software development is required to:
• Perform as a smart buyer and successfully team with industry
• Maintain expertise with the latest technologies
Gov’t Software Experts team with Industry SW Experts to Define, Design, Develop, and Deliver Software Systems
[Diagram: software development responsibility shared between G (NSWCDD SMEs) and I (industry organizations) across DEFINE System Req's, ARCHITECT System & Software, DEVELOP System & Software, and INTEGRATE & TEST System, with the G/I percentage varying by phase]
Government and Industry Software Development Teaming: Win-Win-Win-Win
Warfighter
− Faster receipt of capabilities
− Increased capabilities
− Higher quality and more reliable systems
Government Program Offices
− Improved Technology, Cost, and Schedule Estimates and Assessments
− Increased and maintained corporate knowledge
− Increased acquisition leverage and flexibility (more competition)
Industry
− Improved proposal assessments (smarter partner, not just lowest bid wins)
− Reduced risk (smarter partner, improved requirements, government accountability)
− More profit (less dollars on rework; increased system production and upgrades)
Taxpayer
− Better utilization of tax dollars
− High quality, reliable, secure systems = Better protection of serving family members
This software development teaming approach has been used consistently and successfully for some of the Navy's most critical warfare systems
Software Success: Communication via Integrated Product Teams (IPTs)
[Diagram: development activities spanning mission level req's/architecture/interfaces; system level req's/architecture/interfaces; software level req's/architecture/interfaces; software components; integrated system engineering builds and integrated system test builds; integrated system testing; and operational support]
Include senior level software architects during initial SE efforts
Throughout the system engineering process, utilize integrated multi-discipline product teams. Include: user & operator rep's, system engineering, software engineering, test, logistics, safety, etc.
Communicate, Communicate, Communicate!
Identify & mitigate cost, schedule, and technical risks
Daily "stand-up" meetings; weekly discipline-specific meetings (SE, SW); focused cost, schedule, technical performance, and risk meetings
Structured & metrics-based communication
Software Assurance Best Practices: "Engineer-In" Quality, System Survivability, Security, and Resiliency
[System Engineering "V" diagram, as shown earlier]
Programs should designate a Chief SOFTWARE Architect in addition to a Chief System Engineer
Define assurance and resiliency requirements
Architect/design resilient "defense-in-depth" systems: Protect-Detect-Isolate-Endure-Recover
Utilize tools to remove vulnerabilities during development
Define and utilize metrics and tools to quantify SW vulnerability, survivability, resiliency, and risk
Utilize data-driven best software engineering practices; include independent audits to ensure best practices
Utilize processes, tools, and technology to facilitate vulnerability/threat detection, isolation, and recovery
Train the workforce on both SW best practices and cyber vulnerabilities, threats, tools, secure coding, and resilient design/implementation
Cannot "test in" system & software security & resiliency; Information Assurance patches only address known threats
GOAL: Define, develop, and deliver high quality, secure, and resilient systems
Software Assurance
Software reliance, complexity and cyber threats are increasing
Software Assurance focus must be more than just cyber security
Information Assurance (IA) compliance does NOT equal cyber security
There is no single “silver bullet” to ensure software quality and security
SW Assurance must be addressed throughout the system engineering life-cycle
Software Assurance requires application of software best practices, tools, and measures
Cannot “test-in” SW assurance, must “design-in” quality, security, and resiliency
Key to Success: Increased communication and collaboration
SW Assurance Definition (DOD): The level of confidence that software functions as intended and is free of vulnerabilities, either intentionally or unintentionally designed or inserted as part of the software, throughout the lifecycle (per DoDI 5200.44)
SW Assurance Definition (Software Engineering Institute): "Application of technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner, are free from accidental or intentional vulnerabilities, provide security capabilities appropriate to the threat environment, and recover from intrusions and failures."
Software system characteristics goals (more than just "secure"): Secure, Safe, Reliable, Modular, Maintainable, Scalable, Portable, Defect Free, and Resilient (meets warfighter mission critical needs despite cyber intrusion)
Naval Software Improvement Interactions
Facilitate awareness and application of best software engineering practices via increased communication and collaboration
[Collaboration chart:
- Dept of Defense: DOD SW Assurance Community of Practice (SWA COP) and JFAC Working Group; development of DOD SWA best-practice based policies and guides (JFAC goals & approach). Naval SW Rep: Terry Sellers; NSWCDD participants: Joe Heil, Tammye Thornton, Carol Lee; SWA Technical WG: Carol Lee (NSWCDD)
- Deputy Assistant Secretary of the Navy: DASN(RDTE) Software Lead: Terry Sellers (NSWCDD); DASN(RDTE) Program Protection: Claudia Morgenrood; generation of best practice based policies and guides
- Naval System Commands: Naval System Engineering Stakeholders Group SESG Software Working Group (SWWG), Lead: Joe Heil (NSWCDD), development of Naval SW best-practices based policies, guides, and standards; Naval Software Community of Practice (SW COP), Lead: Joe Heil (NSWCDD), sharing of best software practices and development of studies & reports. Participants: NSWC Divisions (Carderock, Crane, Corona, Dahlgren/Dam-Neck, Indian Head, Panama City, Pt. Hueneme); NUWC Division Newport; NAWC Divisions China Lake and Patuxent River; MARCOR (Marine Corps); NOSSA (Naval Ordnance Safety and Security Activity); SPAWAR (Space and Naval Warfare Systems)
- Academia / Industry / Agencies: Defense Acquisition University (DAU), Naval Postgraduate School (NPS), Software Engineering Institute (SEI), NASA and FAA SW leaders
- Software Reliant System Program Offices: application of best SW practices]
SOFTWARE IMPROVEMENT PRODUCTS:
OSD Defense Acquisition Guide (DAG)
OSD Program Protection Policy (PPP): SW Assurance
Naval Open Architecture Contracts and Metric Guide
Naval System Engineering Guide: Software Sections
Naval Guidebook for Acquisition of SW Reliant Systems
Naval SW Artifact Sharing Tool / Site
White Paper: Software Reliability
Acronyms: DASN/RDTE: Deputy Assistant Secretary of the Navy, Research Development Test and Evaluation; NAWC: Naval Air Warfare Centers; NSWC: Naval Surface Warfare Centers; NUWC: Naval Undersea Warfare Centers; JFAC: Joint Federated Assurance Center; FAA: Federal Aviation Administration
Naval Software Community of Practice (SW COP): Executive Summary
Background: In 2009, NSWCDD initiated the establishment of a Naval Software Community of Practice (SW COP)
Goal: Improve Naval warfare software systems' cost, schedule, and technical performance.
− Share best practices, processes, tools, techniques, and artifacts
− Provide resources to help solve complex technical software problems
− Provide access to software engineers with specialized expertise
− Provide awareness of software laws, policies, guides, and requirements
− Promote maintaining government in-house applied software expertise
Approach: Increase communication and collaboration between government SW experts
Participation: 280+ participating software experts from 17 different organizations
Results: 800+ artifacts posted to the knowledge sharing site
Results: 850+ hours saved via collaboration
PRODUCTS: SW COP provided inputs to DoD and DoN policies and guides:
OSD Program Protection Policy (PPP) SW Assurance (IN-WORK)
DOD SEI SW Sustainment Study (In-Work)
Office of the Secretary of Defense (OSD) Defense Acquisition Guide (DAG)
Naval Open Architecture (OA) Contracts Guide
Naval Open Architecture (OA) Metrics Guide
Naval System Engineering Guide: Software Sections
Naval Guidebook for Acquisition of Software Intensive Systems (SW Guidebook)
Collaboration requests/responses: 128 requests, 568 responses
Keys to Software System Acquisition Success
Summary
METRICS DRIVEN: applied technical expertise, technical processes, and management processes yield high quality products, delivered on time & within budget, with continuous data-driven improvement
[Diagram: software development responsibility shared between G (government organizations) and I (industry organizations) across DEFINE System Req's, ARCHITECT System & Software, DEVELOP System & Software, and INTEGRATE & TEST System, with the G/I percentage varying by phase. RESULT: government owned & controlled system architecture & software]
Technical Expertise coupled with
Data-Driven Continuous Improvement
Gov’t Experts Teaming With Industry
HANDS-ON DEVELOPMENT
Utilize Data-Driven Best Practices and Maintain In-House Applied Software Expertise
Project Reviews: Quarterly Execution Reviews, SETR Reviews
[System Engineering "V" diagram, as shown earlier]
Gov't SW engineers: hands-on, full spectrum engineering
Data-driven continuous improvement cycle (ALL PROJECTS): 1. Define & Refine Process & Metrics; 2. Estimate Cost, Schedule, Quality; 3. Track Actuals: Cost, Schedule, Quality; 4. Analyze Metrics
Best Practices
Maintaining and utilizing gov’t in-house applied software engineering expertise
Disciplined data-driven project execution and continuous improvement
Disciplined requirements management
Formal Risk Management
Open architected and defense-in-depth architected systems
Agile / Rapid / Build-a-little Test-a-little development methodologies
Design-in software quality, security, and resiliency
Increased communication and collaboration; sharing of best-practices
Data Driven Success
"In God we trust, all others must bring data." – W. E. Deming
"For every opinion, there is an equal and opposite opinion; but for every fact there is not an equal and opposite fact." – L. Albuquerque
"Without data, you are just another person with an opinion." – Unknown
"You cannot expect what you do not inspect." – MARCOR proverb
"Trust but verify." – Ronald Reagan
BACKUP
Software Challenges: Increasing Software Size and Complexity
[Diagram: software elements of a system, with element counts growing from tens to millions and complexity (logic paths, threads of execution, timing, etc.) growing from low to high. A SYSTEM decomposes into segments; segments into SW components (CSCIs) with SW-level req's and I/Fs; components into objects and files (thousands); objects into source lines of code (SLOC, hundreds of thousands and more). System-level req's and interfaces number in the tens; a single system comprises millions of elements]
Many Warfare System Program Managers do not have applied expertise with software engineering.
Software is often treated as a "black box"; software size and complexity are not understood.
Software Challenges: Dept. of Defense (DOD) Software System Acquisition Results
[Bar chart: DOD software system acquisition performance, percentage of programs from 0% to 100%. Failure categories (cost, schedule, or technical performance): delivered late or over budget; exceeded cost threshold (Nunn-McCurdy breach); failed initial operational tests; development dollars spent on rework due to SW problems; capabilities not delivered. Success categories: capabilities delivered; passed operational testing; on schedule and budget. Reported values include 84%, 50%, 50%, 40%, and 39%.]
Software intensive warfare system engineering efforts are not consistently successful with regards to cost, schedule, and technical performance.
References:
Government Accountability Office (GAO), Report to Congressional Committees: Best Practices, February 2008.
Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, Report of the Defense Science Board (DSB) Task Force on Developmental Test and Evaluation, May 2008.
Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, Report of the Defense Science Board (DSB) Task Force on Defense Software, November 2000.
Secretary of Defense (SECDEF), SECDEF Memo: "Department of the Navy Acquisition," December 2008.
Senator Carl Levin, U.S. Senate Committee on Armed Services press release, March 2009.
SW Assurance: Future State
Evolution of system composition and cyber posture:
- 1950-1970: government purpose-built H/W and S/W; known architecture & design ("white box" approach)
- 1980-1990: GOTS & COTS SW and HW; known COTS source; limited cyber awareness ("white box" approach)
- 2000-2016: COTS-reliant SW and HW; increased reliance on COTS ("black box"); unknown COTS source (supply chain); tools/patches address only known threats; increasing unknown cyber threats; exponential cost to inspect COTS
Current state: reactive cyber threat approach; inconsistent best practices
Future state (pro-active): design-in system resiliency (Protect-Detect-Isolate-Endure-Recover); institutionalized data-driven best practices
Paradigm Shift: design in system resiliency and software quality; re-emphasize and re-institutionalize data-driven software best practices
Future State Paradigm Shift assertions:
Cannot assume that all cyber vulnerabilities and threat vectors are known.
Cannot assume that systems are 100% secure and can be protected from penetration.
Utilize Defense-In-Depth and Design-In System Resiliency:
Protect-Detect-Isolate-Endure-Recover
• Limit interfaces to external systems
• Identify and harden (firewall) critical control points (interfaces)
• Add processing to Detect cyber intrusions
• Isolate and limit consequences (protect mission critical components)
• Endure through the Cyber intrusion to successfully complete the mission
• Return the system to a trusted state
Software Success: Open Architecture (OA) Characteristics
* Reference: OA Architectural Principles and Guidelines v 1.5.6, 2008, IBM, Eric M. Nelson, Acquisition Community Website (ACC) DAU Navy OA Website
Composability: the system provides recombinant components that can be selected and assembled in various combinations to satisfy specific requirements
Interoperability: ability of two or more subsystems to exchange information and utilize that information
Open Standards: standards that are widely used, consensus based, published, and maintained by recognized industry standards organizations
Maintainability: the ease with which maintenance of a functional unit can be performed in accordance with prescribed requirements
Extensibility: ability to add new capabilities to system components, or to add components and subsystems to a system
Modularity: partitioning into discrete, scalable, and self-contained units of functionality, with well defined interfaces
Reusability: ability for an artifact to provide the same capability in multiple contexts
[Diagram links these characteristics with "is enabled by" / "is facilitated by" relationships]
Open Architected Software System:
A system composed primarily of common software that can be utilized across a wide range of platforms with minimal changes
Open Systems Facilitate Achieving the Following:
Reduce life cycle costs, Reduce acquisition time
Increase system reliability, maintainability, and quality
Increase competition (small business develops components)
Best Practices: Communication
Well defined and documented Statement of Work (SOW):
- Clearly Defined Tasking, Roles, Responsibilities and Deliverables
- Protect and Ensure government ownership rights of software deliverables
Establish and maintain frequent periodic structured communication
- Structured (standardized) agenda and data/metric-based reporting (not subjective)
- Maintain planned vs. actual cost, schedule, and technical performance metrics
Document and track Risks: Utilize Risk Management Tools (e.g. Risk Exchange)
Ensure Cross Organizational and Engineering Discipline Communication
- Utilize Integrated Product Teams (IPTs): customer, users, system engineers, software, test, etc.
Daily "stand-up" meetings with program and technical leads: short, concise, risk focused
Weekly specific engineering discipline (e.g. SE, SW) meetings
- Cost, Schedule, Technical Performance and Risk meetings
Event Driven Project Reviews: Technical Reviews, Delivery Readiness, Project Completion
Communicate, Communicate, Communicate!
Identify, communicate, and mitigate cost, schedule, and technical risks