Avnet Services Challenge24 DevOps Assessment Findings & Recommendations
Post on 17-Jan-2016
Avnet Services Challenge24 DevOps Assessment
Findings & Recommendations
2015
Agenda
1. Assessment Overview
2. Findings
3. Recommendations
Assessment Overview
Challenge24 DevOps Assessment Objectives
• Assess a project’s software development process and produce a Value Stream Map of its “Concept to Cash” timeline.
• Identify and quantify development bottlenecks, delays, wait time, or risks to quality.
• Identify tool air-gaps or manual processes that are impeding productivity and software quality.
• Assess the project’s maturity against Avnet’s proprietary DevOps Reference Maturity Model.
• Provide actionable recommendations for high-value areas for improvement.
Overview
• Project Description: IVR – Interactive Voice Response system. The IVR handles most incoming calls into BCBST as well as outbound survey and blast campaigns. This team maintains the application, including production support, product maintenance, and system enhancements.
• Development Team Members: Stephen, Wes, Chris, Aarju, David, Senthil.
• Operations Team Members: Steve, Vijay, Terry, Michael, Scottie
• Assessment Sponsor(s): Terry, Trevin
• Assessment Team Member(s): John, Rolf
Assumptions
• This presentation does not cover Value Stream Mapping concepts or definitions.
• IVR’s typical User Story/Product Backlog Item (PBI) size is between 20 and 160 hours of work (Minor Enhancement in PPM).
• The transformation of one PBI from Concept to Operations was analyzed.
• To keep the VSM model and math simple, productivity inefficiencies related to meetings, sick time, vacations, etc. are not accounted for.
Findings
General Findings
1. Transition to SCRUM
• The IVR Development Team has been using SCRUM for the past 11 months.
• Sprints are 4 weeks in length.
• The transition has been viewed as highly beneficial, with productivity/throughput estimated to have increased anywhere from 50% to 200%+.
• Approximately 20% of each Sprint’s capacity is used to address Technical Debt and Refactoring to make it easier/more efficient to enhance IVR in the future.
• The team is continuing to identify and address areas for improvement to further increase productivity & quality.
General Findings
2. Tool Use & Automation
• As described later, Build, Deploy, and Testing are predominantly manual.
• Even though Build & Deploy are manual, they do not consume much time during a Sprint.
• Thus, Build & Deploy automation itself will not significantly increase productivity. However, the repeatability and standardization inherent in build & deploy automation will significantly reduce or even eliminate quality issues related to human error, freeing that time up for software development.
Requirements Findings
1. Requirements are initially input into Pega/HP PPM. The majority are considered Minor Enhancements.
2. Information related to Requirements is duplicated and stored in 6 different tools/locations: Pega, HP PPM, a spreadsheet on SharePoint, Quality Center, ClearQuest, and Scrummy.com.
3. There is currently about 6 Sprints of Product Backlog and 6 Sprints of Technical Backlog.
4. Pent-up Product Backlog items have been delivered, and as a result, IVR Stakeholders are growing the backlog with new items.
Design, Code, Unit Test Findings
1. A PowerPoint document is used to capture the current state of IVR menus, etc. A design tool such as Rational System Architect is not commonly used.
2. Source code is managed in ClearCase.
3. Engineers use their local machines to develop software. This practice has periodically introduced issues due to subtle differences in environments/configurations.
4. Unit testing is manual. An automated unit test harness such as JUnit is not in place.
5. Over time, the software is being Refactored to allow for more database-enabled configuration. When complete, future changes can be implemented more quickly.
Test Findings
1. QA receives software to test with about 5-6 days remaining in the Sprint, which occasionally is not enough time.
2. For the past two Sprints, Test Cases have been input into Quality Center. They are not yet being traced to Requirements in Quality Center.
3. Test execution is manual. Test results are stored in Quality Center and used for reporting, change ticket, and audit purposes.
4. Defects found in Test are not yet opened in Quality Center. This is intended to begin soon.
5. A regression test suite is not yet available. It is now being slowly created as test cases are entered/stored in Quality Center. The lack of a regression suite has caused sporadic quality issues over the past 11 Sprints.
6. Generally speaking, within a Sprint the defect identification/rework/retest cycle is very quick.
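The traceability gap noted in items 2 and 5 becomes mechanically checkable once test cases and requirements live in one tool. A minimal sketch of that check, with hypothetical requirement and test-case IDs (this makes no claim about Quality Center's actual API, only about the data relationship):

```python
def untraced_requirements(requirements, test_to_req):
    """Find requirements with no covering test case.

    requirements: iterable of requirement IDs.
    test_to_req: mapping of test-case ID -> requirement ID it verifies.
    Returns the requirements no test case traces to (regression gaps).
    """
    covered = set(test_to_req.values())
    return [r for r in requirements if r not in covered]

# Hypothetical IDs, for illustration only:
reqs = ["REQ-1", "REQ-2", "REQ-3"]
mapping = {"TC-10": "REQ-1", "TC-11": "REQ-1", "TC-12": "REQ-3"}
# untraced_requirements(reqs, mapping) reports REQ-2 as untested.
```

The same untraced list is also a natural work queue for building out the regression suite.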
Build/Deploy Findings
1. The Build/Deploy process is still very manual and, other than for production, is not managed/controlled by a single person. Release backout/rollback is also manual.
2. Layers of the system stack such as OS, patches, COTS software, configuration files, etc. are not configuration managed or built as needed with the IVR application.
3. The manual build/deploy process has become better, but still periodically introduces issues for QA as well as Production.
4. Deployment to Production typically occurs 5 days after the completion of the 4 week Sprint.
5. Numerous approvals are required to deploy, but these do not appear to hamper or delay flow.
Operations Findings
1. Most server components of IVR are virtualized.
2. Server and OS health is well monitored.
3. Application health is somewhat monitored.
4. Automated problem resolution (e.g., service restarts) has not yet been implemented.
5. Operational issues have only resulted in 3 software defects being created. Traceability between operational issues and software defects/enhancements was not found.
6. A CMDB is not being used.
Value Stream Mapping Basic Concepts
• Value stream maps are used to measure product development flow, and to identify/measure productive activities as well as wasteful ones.
• Typically, the flow of one item (enhancement request, feature, or user story) is mapped through the process. In the case of IVR we modelled:
  • Both a small (20 hour) and a larger (160 hour) minor enhancement
  • Both a short initial wait time (1 day) and a long initial wait time (21 days)
• Cycle Time (CT) is time invested adding value to the measured item
• Waste is time the measured item spends outside of CT and includes:
  • Wait Time or Delays
  • Rework
  • Non-value-added activities
• Efficiency is CT / Elapsed Time
• Elapsed Time is the time to complete CT plus Waste for a single step
• Overall Processing Time is calculated by adding the elapsed times
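The definitions above reduce to simple arithmetic. A minimal sketch, using the common process-cycle-efficiency convention that efficiency is CT divided by elapsed time (CT + waste); the numbers below are illustrative, not figures from the IVR value stream:

```python
def step_metrics(cycle_time, waste):
    """Per-step VSM figures: elapsed time = CT + waste,
    efficiency = CT / elapsed time. Any consistent time unit works."""
    elapsed = cycle_time + waste
    return elapsed, cycle_time / elapsed

def overall_processing_time(steps):
    """Overall Processing Time: sum of per-step elapsed times.
    steps is a list of (cycle_time, waste) pairs."""
    return sum(ct + waste for ct, waste in steps)

# Illustrative example: a step with 20 hours of CT and 60 hours of wait
# has 80 hours elapsed and 25% efficiency.
elapsed, efficiency = step_metrics(20.0, 60.0)
```

Because waste enters the denominator, long waits (e.g., a PBI sitting in the backlog) drag efficiency down even when the value-adding work itself is fast.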
Value Stream Map
Major Sources of Inefficiency
Value Stream Analysis Observations
• Largest contributors to Value Stream inefficiency are times spent by a PBI:
  • Waiting to be allocated to a Sprint
  • Waiting for the Sprint to complete
  • Waiting between the completion of the Sprint and Deployment to Production
• Rework/retest cycles are currently not significant sources of inefficiency
• Delays related to Approvals are not significant sources of inefficiency
[Figure: Avnet Services DevOps Reference Architecture, annotated with maturity scores (2–5). The Concept to Cash Value Flow runs Customer Needs → Concept Def → Solution Dev → Test → Build & Deploy → Provision → Monitor → Event/Prob Mgmt → Customer Feedback, spanning DEV and OPS. Business Process layer: Agile Portfolio Management; Production Support. Frameworks & Practices layer: Enterprise Agile (SAFe, DAD), CMMI, ITIL, COBIT; Agile Reqs Mgmt; Agile Arch & Design; Agile Project Management; Dev, Change & Config Mgmt; Automated Testing; Cont Int & Delivery; Automation; Orchestration; Monitoring; Cloud Mgmt; Self Service; Change & Config Mgmt; Automated Organizational Dashboards. Tools layer: Portfolio Management System; Reqs Mgmt System; Arch & Design Tools; IDE & SCM System; Automated Build & Deploy Tool; Auto Unit, Func, Non-Func, & SV Test Tools; Cloud Mgmt System; Automation System; Orchestration System; Monitoring System; Knowledge Base; CMDB; ITSM System; Application Lifecycle Management System; Development and Operations Intelligence System.]
[Figure: Mapping to Avnet Services DevOps Reference Architecture — the same Concept to Cash Value Flow diagram (Customer Needs through Customer Feedback, with the Business Process, Frameworks & Practices, and Tools layers), shown without the maturity score annotations.]
Recommendations
Recommendations
1. Migrate to the new DevOps Platform. This will provide a large number of standardization and automation benefits, including:
• Automated Build & Deploy via Rational Team Concert, Nexus Pro, and UrbanCode Deploy
• Improved Scrum project management
• Reduction in the number of locations where Requirements are stored
• Requirements and defects flowing automatically from Quality Center to Rational Team Concert
• Better dashboards and reporting, since more development data will be in one place rather than scattered across numerous locations
Recommendations
2. Implement an automated unit test harness
3. Better utilize HP Quality Center for Requirements and Test Management
• Enter ALL IVR requirements into QC
• Associate existing and future test cases to requirements in QC
• Over time, build the regression test suite by adding additional test cases not necessarily related to the work in a Sprint
• Investigate the potential for IVR test automation
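Recommendation 2 names no specific harness; the findings mention JUnit, the Java equivalent of the pattern. As a language-neutral illustration, here is the same xUnit pattern in Python's built-in unittest, with a hypothetical IVR menu-routing function standing in for real application code:

```python
import unittest

def ivr_menu_route(dtmf_digit):
    """Hypothetical IVR routing function, used only to illustrate the harness:
    maps a caller's keypad digit to a destination queue."""
    routes = {"1": "claims", "2": "benefits", "0": "operator"}
    return routes.get(dtmf_digit, "invalid")

class TestIvrMenuRoute(unittest.TestCase):
    """Each test method is discovered and run automatically by the harness."""

    def test_known_digits_route_to_queues(self):
        self.assertEqual(ivr_menu_route("1"), "claims")
        self.assertEqual(ivr_menu_route("2"), "benefits")
        self.assertEqual(ivr_menu_route("0"), "operator")

    def test_unknown_digit_is_rejected(self):
        self.assertEqual(ivr_menu_route("9"), "invalid")
```

Run with `python -m unittest`. Once such tests exist, they run identically on every engineer's machine and in every Sprint, which is exactly the repeatability the manual unit testing described in the findings lacks.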
Recommendations
4. During the next IVR COTS upgrade cycle, leverage UrbanCode Deploy to build IVR dev, test, and production virtual machine environments from bare metal on up
• This would automate and standardize the creation of IVR environments, eliminating configuration drift
• Quality & productivity would improve, since issues related to configuration differences between environments would be eliminated
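Even before a full bare-metal rebuild is feasible, the configuration-drift problem noted in the findings can be attacked incrementally: capture each environment's settings as data and diff them. A minimal sketch (flat dicts of setting name to value; real environment descriptions covering OS patches, COTS versions, and config files would be richer):

```python
def config_drift(env_a, env_b):
    """Compare two environment descriptions and report every difference.

    env_a, env_b: flat dicts mapping setting name -> value.
    Returns {setting: (value_in_a, value_in_b)} for each setting that
    differs; a setting absent from one environment appears as None.
    """
    all_settings = set(env_a) | set(env_b)
    return {s: (env_a.get(s), env_b.get(s))
            for s in all_settings
            if env_a.get(s) != env_b.get(s)}

# Hypothetical environment snapshots, for illustration only:
dev = {"os_patch": "7.2", "jvm": "1.8.0_45", "tls": "on"}
prod = {"os_patch": "7.1", "jvm": "1.8.0_45"}
# config_drift(dev, prod) flags os_patch and tls as drifted.
```

A drift report like this, run before each deployment, turns "works in dev but not in prod" surprises into a pre-deploy checklist item.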
Future Considerations
• Shorter Duration Sprints
• Kanban
• Automated Performance & Security Tests
• Static Code Analysis
• Test Driven Development
• CMDB
• Operations Automation & Orchestration
Questions?
Thank You!