TRANSCRIPT
SCRAM: A Method for Assessing Schedule Compliance Risk
SSTC 2011, Salt Lake City, Utah, May 2011
Elizabeth (Betsy) Clark, Software Metrics Inc., Haymarket, VA
Bradford Clark, Software Metrics Inc., Haymarket, VA
Angela Tuffley, Systems and Software Quality Institute, Queensland, Australia
Adrian Pitman, Defence Materiel Organisation, Australian Dept of Defence
Report Documentation Page (Standard Form 298, OMB No. 0704-0188)
Report date: May 2011. Title: SCRAM: A Method for Assessing Schedule Compliance Risk.
Performing organization: Software Metrics Inc., Haymarket, VA 20168.
Distribution/availability: Approved for public release; distribution unlimited.
Supplementary notes: Presented at the 23rd Systems and Software Technology Conference (SSTC), 16-19 May 2011, Salt Lake City, UT. Sponsored in part by the USAF. U.S. Government or Federal Rights License.
Security classification: unclassified. Number of pages: 42.
What does SCRAM mean?
Go away!
Secure Continuous Remote Alcohol Monitoring
As modeled here by Lindsay Lohan
Schedule Compliance Risk Assessment Methodology
2
SCRAM
Schedule Compliance Risk Assessment Methodology
Collaborative effort:
Australian Department of Defence - Defence Materiel Organisation
Systems and Software Quality Institute, Brisbane, Australia
Software Metrics Inc., Haymarket, VA
3
DMO SCRAM Usage
SCRAM has been sponsored by the Australian Defence Materiel Organisation (DMO)
To improve our Project Schedule Performance in response to Government concern as identified by the Australian National Audit Office (ANAO)
ANAO is equivalent to the US Government Accountability Office (GAO)
DMO equips and sustains the Australian Defence Force (ADF)
Manages 230+ Major Capital Equipment Projects & 100 Minor (<$20M) defence projects
4
DMO SCRAM Usage (cont.)
SCRAM has evolved from our reviews of troubled programs
Schedule is almost always the primary concern of program stakeholders (delays to war fighter capability are unacceptable)
SCRAM is a key component of our initiative to identify and remediate (and eliminate) root causes of schedule slippage
5
Topics
Three Common Questions Addressed by SCRAM
Benefits of Using SCRAM
SCRAM Key Principles
SCRAM Process
Future plans for SCRAM
6
Three Common Questions
SCRAM addresses three fundamental questions.
1. Why is schedule slipping?
Root cause analysis
2. Is the schedule credible?
Assess risk and identify issues (including estimated rework)
Assess BoEs (Basis of Estimate)
Perform schedule “Health Check”
Perform Monte Carlo analysis using inputs from other SCRAM areas
3. How can future slips be prevented?
General recommendations based on SCRAM review findings
Guidance on “leading indicators” of slippage
7
What SCRAM is Not
Not an assessment of technical feasibility
Not an assessment of process capability. However, process performance may be identified and treated as an issue if it is found to contribute to slippage.
8
Why is schedule slipping?
Program managers are flooded with a wealth of data and details
The challenge is to organize all of this information:
Identify cause(s) of slippage (schedule slippage is a symptom of other factors)
Take effective action to address problems
Organizing the information based on SCRAM should:
De-clutter the massive amounts of information on a project
Relate the different issue areas to each other
Highlight missing information
SCRAM is based on a “Root Cause Analysis of Schedule Slippage” (RCASS) model
9
Root Cause Analysis of Schedule Slippage (RCASS) Model
After many assessments, refined RCASS for guidance in:
Categorizing the wealth of data and details
Assessing the causes of slippage
Recommending a going-forward plan
[RCASS diagram: Stakeholders, Requirements, Subcontractors, Functional Assets, Workload, Rework, Staffing & Effort, Schedule & Duration, Schedule Execution, Management & Infrastructure]
Adapted from the Integrated Analysis Model in McGarry et al., Practical Software Measurement: Objective Information for Decision Makers
10
SCRAM-RCASS
[RCASS diagram: Stakeholders, Requirements, Subcontractors, Functional Assets, Workload, Rework, Staffing & Effort, Schedule & Duration, Schedule Execution, Management & Infrastructure]
11
Root Cause Analysis - Examples
[RCASS diagram highlighting Stakeholders and Requirements]
Stakeholders: “Our stakeholders are like a 100-headed hydra – everyone can say ‘no’ and no one can say ‘yes’.”
Requirements: Misinterpretation of a communication standard led to an additional 3,000 requirements to implement the standard.
12
Root Cause Analysis - Examples
[RCASS diagram highlighting Subcontractors and Functional Assets]
Subcontractors: Subcontractor omitting processes in order to make delivery deadlines led to integration problems with other system components.
Functional Assets (COTS/MOTS): Commercial-off-the-shelf (COTS) products that do not work as advertised, resulting in additional work or replacement with different products. Underestimating the amount of software code that must be written/modified in a legacy system.
13
Root Cause Analysis - Examples
[RCASS diagram highlighting Workload, Staffing & Effort, and Schedule & Duration]
Workload: Optimistic estimates. Source lines of code underestimated. Contract data deliverables workload often underestimated by both contractor and customer.
Staffing & Effort: High turnover, especially among experienced staff.
Schedule & Duration: Area of primary interest.
14
Root Cause Analysis - Examples
[RCASS diagram highlighting Schedule Execution, Rework, and Management & Infrastructure]
Schedule Execution: Schedule replans are not communicated to program staff or stakeholders. Lack of, or poorly integrated, master schedule. Integrated schedule elements not statused consistently across the program; actual status unknown. External dependencies not integrated or tracked.
Rework: Often underestimated or not planned for (e.g. defect correction).
Management & Infrastructure: Lack of adequate test facilities (in terms of fidelity or capacity).
15
Three Common Questions
1. Why is schedule slipping?
The Root Cause Analysis of Schedule Slippage (RCASS) model guides the analysis approach
2. Is the current schedule credible?
Assess the risks and issues
Assess the BoEs (Basis of Estimate)
Perform “Schedule Health Checks”
Perform Monte Carlo analysis
3. How can future slips be prevented?
16
Assess the Risks and Issues
Are risks and issues understood and managed?
What mitigations are in place to address the risks?
Have the issues been analyzed to determine corrective actions?
Are corrective actions being managed through to closure?
Is there contingency in the schedule if risks are realized? Or is the schedule so tight that nothing can go wrong?
17
Assess the BoEs
Technical expertise is essential
Basis of estimate will vary by phase and activity:
Requirements
Source Lines of Code
Test cases/procedures
Evidence of use of historical data, models
18
Schedule Health Checks
To evaluate schedule construction and logic
Includes analyses of task dependencies, task constraints, and available schedule float
WBS and Master Schedule are reviewed for alignment
Government, Prime, and Subcontractor schedule integration / alignment is reviewed
Ensure external dependencies are included and linked in the schedule
Interfaces, resources, facilities, Government Furnished Equipment (GFE), test assets, etc.
19
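The dependency and float checks above can be sketched in a few lines. The snippet below is an illustration only, not SCRAM tooling: the task names, durations, and links are invented. It runs a forward/backward pass over a small task network to compute total float, flagging zero-float (critical-path) tasks and dangling tasks with no successor link, two common health-check findings.

```python
# Illustrative schedule "health check" sketch -- NOT the SCRAM tooling.
# Task names, durations, and dependency links below are invented.

# name -> (duration_days, predecessors); assumes predecessors appear earlier
tasks = {
    "design":    (10, []),
    "code":      (15, ["design"]),
    "test_prep": (5,  ["design"]),
    "test":      (8,  ["code", "test_prep"]),
}

def health_check(tasks):
    # Forward pass: earliest start (es) / earliest finish (ef)
    es, ef = {}, {}
    for name, (dur, preds) in tasks.items():
        es[name] = max((ef[p] for p in preds), default=0)
        ef[name] = es[name] + dur
    project_end = max(ef.values())

    # Backward pass: latest start (ls) / latest finish (lf)
    succs = {n: [m for m in tasks if n in tasks[m][1]] for n in tasks}
    ls, lf = {}, {}
    for name in reversed(list(tasks)):
        dur = tasks[name][0]
        lf[name] = min((ls[s] for s in succs[name]), default=project_end)
        ls[name] = lf[name] - dur

    findings = []
    for name in tasks:
        if ls[name] - es[name] == 0:
            findings.append(f"{name}: zero float (critical path)")
        elif not succs[name] and ef[name] < project_end:
            findings.append(f"{name}: dangling task, no successor link")
    return project_end, findings

end, findings = health_check(tasks)
print(f"Project duration: {end} days")
for f in findings:
    print(" ", f)
```

Commercial scheduling tools perform these checks against the full integrated master schedule; the point of the sketch is only that the logic checks are mechanical once the network is extracted.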
Schedule Health Checks (cont.)
Allocate three point estimates to tasks on critical and near-critical path based on identified risk from RCASS
optimistic, pessimistic & most likely task duration
Perform Schedule Risk Simulation (e.g. Monte Carlo)
20
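The three-point estimating and simulation step can be sketched as follows. This is a minimal illustration with invented task names and durations, not the commercial risk tooling a SCRAM review would use: each critical-path task gets a triangular duration distribution from its optimistic / most-likely / pessimistic estimates, and repeated sampling yields the probability of meeting the deterministic plan.

```python
# Minimal Monte Carlo schedule risk sketch (illustrative; the tasks and
# three-point estimates are invented, and this is not the SCRAM tooling).
import random

# Critical-path tasks: name -> (optimistic, most_likely, pessimistic) days
critical_tasks = {
    "integration": (20, 30, 55),
    "system_test": (15, 20, 40),
    "acceptance":  (5, 10, 25),
}

def simulate(tasks, trials=10_000, seed=1):
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        # random.triangular(low, high, mode) draws one duration per task
        totals.append(sum(rng.triangular(o, p, m) for o, m, p in tasks.values()))
    return sorted(totals)

totals = simulate(critical_tasks)
plan = sum(m for _, m, _ in critical_tasks.values())   # deterministic plan: 60 days
p_plan = sum(t <= plan for t in totals) / len(totals)  # chance of meeting it
p80 = totals[int(0.8 * len(totals))]                   # 80%-confidence finish
print(f"P(meet {plan}-day plan) ~ {p_plan:.0%}; 80% confident finish by {p80:.0f} days")
```

Because the pessimistic tails are skewed, the simulated mean exceeds the most-likely total, which is why a deterministic plan built from most-likely durations typically shows a low probability of being met.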
Monte Carlo Analysis Example
21
[Screenshot: Open Plan Professional “Risk Histogram View” showing Monte Carlo schedule risk results – a frequency histogram and cumulative probability curve of forecast dates for a final system acceptance milestone]
Three Common Questions
1. Why is schedule slipping?
The Root Cause Analysis of Schedule Slippage (RCASS) model guides the analysis approach
2. Is the current schedule credible?
Assess the BoEs (Basis of Estimate)
Perform schedule “health checks”
Perform Monte Carlo analysis
3. How can future slips be prevented?
General recommendations based on SCRAM assessment
Guidance on measurements to serve as “leading indicators” of future slippage
22
SCRAM Recommendations - Examples
Clarify the delivery scope (requirements and acceptance criteria)
Create an Integrated Master Schedule
Test procedure development should be more closely tracked, and time should be added to the schedule for their review and correction
Additional time in all test phases should be added for re-running tests that fail or are blocked
Enhance fidelity of the integration lab to improve defect identification
23
Root Cause Analysis of Schedule Slippage Model
Provides guidance for collection of measurements
For visibility and tracking in those areas where there are risks
[RCASS diagram: Stakeholders, Requirements, Subcontractors, Functional Assets, Workload, Rework, Staffing & Effort, Schedule & Duration, Schedule Execution, Management & Infrastructure]
24
Topics
Three Common Questions Addressed by SCRAM
Benefits of Using SCRAM
SCRAM Key Principles
SCRAM Process Reference / Assessment Model
Future plans for SCRAM
25
SCRAM Benefits
SCRAM root-cause analysis model (RCASS) useful in communicating the status of programs to all key stakeholders
Particularly executive management
Identifies root causes of schedule slippage and permits early remediation action
Provides guidance for collection of measures
Provides visibility and tracking for those areas where there is risk
Provides confidence in the schedule
26
SCRAM - Benefit
Validate schedule before execution
Widely applicable
SCRAM can be applied at any point in the program life cycle
SCRAM can be applied to any major system engineering activity or phase
Examples
Software-Hardware Integration
Aircraft Flight Testing
Installation/integration of systems on ship
Logistics ERP application roll-out readiness
27
Topics
Three Common Questions Addressed by SCRAM
Benefits of Using SCRAM
SCRAM Key Principles
SCRAM Process Reference / Assessment Model
Future plans for SCRAM
28
SCRAM Key Principles
Minimal Disruption
Information is collected one person at a time
Interviews typically last an hour
Independent Review team members are organizationally independent of the program under review
Non-advocate All significant issues and concerns are considered and reported regardless of origin or source (Customer and/or Contractor).Some SCRAM reviews have been joint contractor/customer team – facilitates joint commitment to resolve outcomes
29
SCRAM Key Principles (cont.)
Non-attribution
Information obtained is not attributed to any individual
Focus is on identifying and mitigating the risk
Corroboration of Evidence
Significant findings and observations based on at least two independent sources of corroboration
Rapid turn-around
One to two weeks spent on-site
Executive briefing presented at end of second week
30
Topics
Three Common Questions Addressed by SCRAM
Benefits of Using SCRAM
SCRAM Key Principles
SCRAM Process
Future plans for SCRAM
31
SCRAM Process
1.0 Assessment Preparation
2.0 Project Awareness
3.0 Project Risk / Issue Identification
4.0 Project Schedule Validation
5.0 Data Consolidation & Validation
6.0 Schedule Compliance Risk Analysis
7.0 Observation & Reporting
Output: Schedule Compliance Risk Quantified
32
SCRAM Team Composition
Assessment conducted by a small team including:
Engineering Assessors
Validate WBS, engineering-related bases of estimate (BoEs), workload estimates, technical risk assessment
Scheduler experienced in the project schedule toolValidates schedule – conducts schedule health checks Performs Monte Carlo risk modelling
Other project domain specialists as neededE.g. Aeronautical Flight Test Engineers
33
SCRAM Key Steps
SCRAM Team briefs the Project on the principles, purpose and approach of the SCRAM
The Project provides the SCRAM team with an initial overview of the current status and project issues
Project Issues and Risks are confirmed by the SCRAM Team through interviews, reviewing documentation and other project assets
Schedule health checks and Monte Carlo analysis are performed
34
SCRAM Key Steps (cont.)
Executive out-brief is prepared and presented
Observations, findings and recommendations
Presentation structured using the RCASS model
Shows cause and effect linkage
Findings allocated a risk code rating
Presented at the end of the second week
The final report is prepared and delivered (an additional two weeks)
35
SCRAM Findings - Examples
Sample Findings with Risk Code Rating
POSITIVE: Functional requirements baselined and agreed; no evidence was identified of requirements churn or creep
POTENTIAL RISK: Limited schedule contingency exists for further rework
HIGH RISK: Lack of an integrated high-level schedule precludes the ability to accurately forecast project milestone achievements. 13 major schedules not integrated at the program level.
36
Process Reference / Assessment Model
Developed as an ISO/IEC 15504 conformant Process Reference Model and Process Assessment Model
Funded by the Australian Defence Materiel Organisation (DMO)
Developed by Systems and Software Quality Institute and Software Metrics Inc.
Delivered June 2010
The models are publicly available to download from:
http://www.scramsite.org
37
Topics
Three Common Questions Addressed by SCRAM
Benefits of Using SCRAM
SCRAM Key Principles
SCRAM Process
Future plans for SCRAM
38
Future Plans
Currently developed: Diagnostic SCRAM (D-SCRAM)
Full-scale application of the method to evaluate challenged projects or Projects of Concern. Used to assess likelihood of schedule compliance and root cause of schedule slippage, and to recommend remediation of project issues
Further evolve the SCRAM process for:
Pro-active SCRAM (P-SCRAM)
To be conducted prior to Contract or at Integrated Baseline Review (IBR) to ensure common systemic issues are avoided before the Program Schedule is contracted or baselined
Monitor SCRAM (M-SCRAM)
Reduced version of D-SCRAM that may be used to monitor project status – project health check performed ad hoc or conducted to support appropriate Gate Reviews
39
Future Plans (cont.)
SCRAM Training & Assessor Qualifications
SCRAM Process Reference and Assessment Model
Further revisions based on feedback from use during SCRAM assessments and Change Requests (Appendix D in the model)
SCRAM Assessment Tool
Prototype has been used
Under development
40
SCRAM
QUESTIONS
For further information contact:
Govt to Govt - Adrian Pitman: [email protected]
Australia - Angela Tuffley: [email protected]
USA - Betsy Clark: [email protected]
USA - Brad Clark: [email protected]
41
Acronyms
ANAO – Australian National Audit Office
BoE – Basis of Estimate
COTS/MOTS – Commercial off the Shelf / Modified off the Shelf
DMO – Defence Materiel Organisation (Australia)
GAO – Government Accountability Office
GFE – Government Furnished Equipment
ISO/IEC – International Organization for Standardization / International Electrotechnical Commission
ISO/IEC 15504 – Information Technology – Process Assessment
RCASS – Root Cause Analysis of Schedule Slippage
SCRAM – Schedule Compliance Risk Assessment Methodology
SMI – Software Metrics Inc. (United States)
SSQi – Systems & Software Quality Institute (Australia)
42