Using Model-Based Systems Engineering to Improve Customer Satisfaction and
Service Availability and Efficiency in the Implementation of ITIL
by Khaled H. AlAjmi
B.S. in Systems Engineering, May 1996, King Fahd University of Petroleum and
Minerals
M.S. in Systems Engineering, May 1999, King Fahd University of Petroleum and
Minerals
A Praxis submitted to
The Faculty of
The School of Engineering and Applied Science
of The George Washington University
in partial fulfillment of the requirements
for the degree of Doctor of Engineering
January 10, 2019
Praxis directed by
John M. Fossaceca
Professional Lecturer of Engineering and Systems Engineering
The School of Engineering and Applied Science of The George Washington University
certifies that Khaled Husain AlAjmi has passed the Final Examination for the degree of
Doctor of Engineering as of January 10, 2019. This is the final and approved form of the
Praxis.
Using Model-Based Systems Engineering to Improve Customer Satisfaction and
Service Availability and Efficiency in the Implementation of ITIL
Khaled H. AlAjmi
Praxis Research Committee:
John M. Fossaceca, Professional Lecturer of Engineering and Systems
Engineering, Praxis Director
Amir Etemadi, Assistant Professor of Engineering and Applied Science,
Committee Member
Muhammad Islam, Professional Lecturer of Engineering and Systems
Engineering, Committee Member
© Copyright 2019 by Khaled H. AlAjmi
All rights reserved
Dedication
To El Bachir Boukherouaa!
Acknowledgements
The author wishes to acknowledge the research advisor, Dr. John Fossaceca, for
the endless support and dedicated guidance throughout this research. The author wishes
to also acknowledge the support and guidance of Dr. Muhammad Islam and Dr. Amir
Etemadi.
Abstract of Praxis
Using Model-Based Systems Engineering to Improve Customer Satisfaction and
Service Availability and Efficiency in the Implementation of ITIL
The Information Technology Infrastructure Library (ITIL) framework is widely
used to manage the strategy, design, transition, operation, and continual improvement of
IT services. While the ITIL framework itself has undergone numerous revisions and
refinements, successfully managing ITIL implementation within organizations is
challenging due to several limitations. These limitations are associated with managing
collaboration and communication within organizations, meeting stakeholder requirements
and service quality objectives, managing risk, and practicing effective decision making.
Although modeling approaches have generally been used to analyze ITIL
implementation, such approaches tend to focus on individual ITIL modules or on a
specific implementation limitation as opposed to the entire ITIL framework.
To address this, we propose the use of model-based systems engineering (MBSE),
which has been shown to provide benefits such as improved collaboration among
stakeholders, enhanced decision-making practices, reduced operational risk, and
improved quality of service to organizations. Because MBSE spans the entire life cycle of
products and services, it has the potential to holistically improve the implementation of
ITIL across an organization. This report proposes an MBSE approach for ITIL
implementation that will result in improvements to customer satisfaction and service
availability and efficiency. Our MBSE approach utilizes the general-purpose Systems
Modeling Language (SysML). The proposed SysML-based ITIL implementation is also
augmented with simulations to validate improvement recommendations for a real-world
use case.
Table of Contents
Dedication ......................................................................................................................... iv
Acknowledgements ........................................................................................................... v
Abstract of Praxis ............................................................................................................ vi
List of Figures .................................................................................................................... x
List of Tables ................................................................................................................... xii
List of Acronyms ............................................................................................................ xiii
Chapter 1—Introduction ..................................................................................................... 1
1.1 Background ....................................................................................................... 1
1.2 Information Technology Infrastructure Library ................................................ 2
1.3 Systems Engineering and Engineering Management ....................................... 4
1.4 ITIL and Systems Engineering ......................................................................... 5
1.5 Model-Based Systems Engineering .................................................................. 7
1.6 Research Motivation ......................................................................................... 8
1.7 Problem Statement ............................................................................................ 8
1.8 Thesis Statement ............................................................................................... 9
1.9 Research Objectives .......................................................................................... 9
1.10 Research Questions and Hypotheses .............................................................. 9
1.11 Scope of Research ......................................................................................... 11
1.12 Research Limitations .................................................................................... 11
1.13 Organization of Praxis .................................................................................. 12
Chapter 2—Literature Review .......................................................................................... 13
2.1 Introduction ..................................................................................................... 13
2.2 Challenges in Managing the Implementation of ITIL Initiatives ................... 16
2.3 Using Modeling and Simulation to Support ITIL Implementation
Initiatives............................................................................................................... 19
2.4 Using SE to Support the Management of Complex and Challenging
Initiatives............................................................................................................... 21
2.5 MBSE and Managing ITIL Implementation Initiatives .................................. 23
2.6 Using MBSE to Address ITIL Implementation Challenges ........................... 29
Chapter 3—Methodology ................................................................................................. 33
3.1 Introduction ..................................................................................................... 33
3.2 Using MBSE to Model ITIL ........................................................................... 33
Chapter 4—Results ........................................................................................................... 52
4.1 Introduction ..................................................................................................... 52
4.2 An ITIL Implementation: Case Study ............................................................. 55
Chapter 5—Conclusions, Challenges, and Recommendations for Future Research ........ 73
5.1 Conclusions ..................................................................................................... 73
5.2 Challenges ....................................................................................................... 73
5.3 Recommendations for Future Research .......................................................... 74
References ......................................................................................................................... 75
Appendix A: Results of Model Validation ....................................................................... 82
List of Figures
Figure 1-1. IT Enablement of Business Functions.............................................................. 1
Figure 1-2. ITIL Modules and Processes (Axelos, 2011). .................................................. 3
Figure 1-3. Systems Engineering Processes (Walden et al., 2015). ................................... 6
Figure 1-4. Process Similarities and Differences between ITIL and
Systems Engineering. .......................................................................................................... 7
Figure 2-1. Standard SysML Diagrams (Walden et al., 2015). ........................................ 23
Figure 2-2. Venn Diagram Depicting the Literature Gap in Using MBSE
for ITIL Implementation. .................................................................................................. 24
Figure 2-3. The Proposed MBSE Approach for ITIL Implementation. ........................... 32
Figure 3-1. Overall ITIL Implementation Project Structure Using SysML. ..................... 35
Figure 3-2. Proposed MBSE Organization. ...................................................................... 36
Figure 3-3. Proposed MBSE Architecture Framework ..................................................... 37
Figure 3-4. Block Definition Diagram of ITIL Service Strategy Module. ....................... 38
Figure 3-5. Requirement Definitions for the Demand Management Service. .................. 39
Figure 3-6. Activity Diagram for Demand Management Service..................................... 39
Figure 3-7. Encapsulating CSFs and KPIs in the Definition of
Demand Management Service. ......................................................................................... 40
Figure 3-8. MBSE Representation of Demand Management Service. ............................. 42
Figure 3-9. Modeling Incident Management Requirements Using SysML. ..................... 44
Figure 3-10. Modeling Incident Management Hierarchy Using SysML. ......................... 45
Figure 3-11. Modeling Incident Management Stakeholder Relationships
Hierarchy Using SysML. .................................................................................................. 45
Figure 3-12. Modeling Incident Management Activities Using SysML. ......................... 46
Figure 3-13. Modeling the Identify and Log Incident Activities Using SysML. ............. 47
Figure 3-14. Modeling the Incident Management Using a Sequence Diagram. ............... 48
Figure 3-15. Simulating the Incident Management Model Using a State Machine
Diagram............................................................................................................................. 49
Figure 3-16. Use Case Diagram of Incident Management. .............................................. 49
Figure 3-17. Invoking MATLAB to Simulate the Incident Management Model. ............ 50
Figure 3-18. MBSE Representation for Incident Management Service. .......................... 51
Figure 4-1. Incident Arrivals - Histogram. ....................................................................... 59
Figure 4-2. Incident Arrivals – Probability Plot. .............................................................. 60
Figure 4-3. Resolution Times - Histograms. ..................................................................... 61
Figure 4-4. Resolution Times – Probability Plots. ............................................................ 62
Figure 4-5. Times to Escalate - Histograms...................................................................... 63
Figure 4-6. Times to Escalate – Probability Plots............................................................. 64
Figure 4-7. Autocorrelation Function of Downtime Residuals. ....................................... 67
Figure 4-8. Autocorrelation Function of Waiting Time Residuals. .................................. 68
Figure 4-9. Cross Correlation Function of Waiting Time and Downtime Residuals. ...... 69
List of Tables
Table 2-1. ITIL Framework Modules and Core Services. ................................................ 16
Table 2-2. Relevant Research on ITIL Implementation. .................................................. 24
Table 4-1. Subset of IM Requirements, Activities, Main CSFs and
Associated KPIs based on the ITIL Framework (Axelos, 2011). ....................................... 53
Table 4-2. Performance Summary of the Commercial Bank’s IM
Service Implementation. ................................................................................................... 56
Table 4-3. Summary of the Fitted Probability Distributions. ........................................... 58
Table 4-4. Specific Target KPIs........................................................................................ 65
Table 4-5. Implementation Improvement Results using the Proposed MBSE
Approach. .......................................................................................................................... 66
Table 4-6. Adherence to the Target KPI Values using the Proposed MBSE. .................. 71
List of Acronyms
IT Information Technology
ITIL Information Technology Infrastructure Library
MBSE Model-Based Systems Engineering
SysML Systems Modeling Language
SDLC System Development Life Cycle
SE Systems Engineering
EM Engineering Management
CMDB Configuration Management Database
CSFs Critical Success Factors
INCOSE International Council on Systems Engineering
OMG Object Management Group
IM Incident Management
KPIs Key Performance Indicators
MoEs Measures of Effectiveness
Chapter 1—Introduction
1.1 Background
Information Technology (IT) functions are an integral enabling component of
any modern organization. Organizations rely on IT to automate their operations, improve
product and service quality, manage various risks, and support sound decision making.
To this end, IT functions enable organizations to develop and maintain business products
and services, which is realized through a wide range of well-established and widely used
best practices, including System Development Life Cycle (SDLC), Agile methods, ISO
20000, and the Information Technology Infrastructure Library (ITIL). Figure 1-1 depicts
the classification of these best practices in terms of their usage for either development or
maintenance purposes.
Figure 1-1. IT Enablement of Business Functions.
1.2 Information Technology Infrastructure Library
Established in the late 1980s, ITIL was introduced by the UK government to help
maintain IT service delivery. ITIL formally defines an IT service as a means of
delivering value to customers’ businesses without the customer bearing the associated
costs and risks. IT services involve technology and fulfill customer needs by enabling
the customer to produce a particular business outcome (Axelos, 2011).
One important aspect of ITIL is the recognition of roles in an organization. When
employees understand their roles and responsibilities, business processes can be easily
followed, communication is enhanced, and work is ultimately accomplished. The
delivery of services will also entail having what are known as Service Level Agreements
or SLAs. These agreements are the commitments that the business has to the customers,
both internal and external to the organization. The details of an SLA represent the
parameters of that service provided, such as service availability.
By definition, ITIL is a collection of processes that aims for continuous IT service
improvement while focusing on improving quality, reducing costs, managing risk, and
enhancing both the efficiency and effectiveness of IT services. ITIL has five modules and
several processes allocated to each one of these modules. The modules with their
respective processes are depicted in Figure 1-2.
Figure 1-2. ITIL Modules and Processes (Axelos, 2011).
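The module-to-process allocation depicted in Figure 1-2 can also be captured as a simple mapping. The subset of processes below is illustrative (names follow the ITIL 2011 edition), not the full catalog shown in the figure:

```python
# Representative subset of ITIL modules and their allocated processes
# (illustrative, per the ITIL 2011 edition; see Figure 1-2 and Axelos, 2011).
ITIL_MODULES = {
    "Service Strategy": ["Service Portfolio Management", "Financial Management",
                         "Demand Management"],
    "Service Design": ["Service Level Management", "Availability Management",
                       "Capacity Management", "Supplier Management"],
    "Service Transition": ["Change Management", "Release and Deployment Management",
                           "Service Asset and Configuration Management"],
    "Service Operation": ["Incident Management", "Problem Management",
                          "Event Management", "Request Fulfilment"],
    "Continual Service Improvement": ["Seven-Step Improvement Process"],
}

for module, processes in ITIL_MODULES.items():
    print(f"{module}: {len(processes)} processes listed")
```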
ITIL is the most widely adopted framework for managing IT services (Iden &
Eikebrokk, 2014). Since its first release, ITIL has evolved to provide
up-to-date best practices that support IT service management in any organization,
regardless of its size or business domain (Axelos, 2017). More organizations are seeking to
implement ITIL owing to its nonproprietary nature.
Past research has shown that process standardization initiatives such as ITIL are
“a driver of performance improvements in terms of cost, time, efficiency, effectiveness,
quality, and responsiveness” (Romero et al., 2015, p. 266). Successful ITIL
implementation initiatives in organizations are associated with reducing the occurrences
and recurrences of IT incidents and problems, increasing the IT service quality,
improving customer satisfaction, and reducing the total cost of IT service ownership. All
of these outcomes allow customers to gain confidence in IT services and organizations to
enhance their overall productivity and hence their return on investment (Sebaaoui &
Lamrini, 2012; Mikaelian et al., 2011).
Although the management of the ITIL implementation is described as complex
and challenging, organizations continue to implement ITIL without extensive
assessments of its impacts on their workplaces (Silva et al., 2017). Organizations tend to
underestimate the effort, duration, risk, and cost of ITIL implementation and choose to
proceed with traditional implementation approaches that are heavily dependent on
documentation (Pereira & Silva, 2011; Iden & Eikebrokk, 2014; AlShamy et al., 2012).
Organizations may also encounter challenges if the implementation of ITIL initiatives is
not properly managed (Marrone & Kolbe, 2011). These challenges are associated with
internal and interdepartmental communications in addition to the identification and
involvement of key stakeholders, the assumption that ITIL will be readily implementable
from the start with minimal need for the customization or tailoring of existing processes,
and the expectation that the return on investment will be immediate (Iden & Eikebrokk,
2014; Müller & Lichtenberg, 2018).
1.3 Systems Engineering and Engineering Management
Systems engineering (SE) is the field of engineering that enables the successful
implementation of complex and challenging initiatives (International Council on Systems
Engineering, 2011). Both SE and engineering management (EM) advocate approaches
that are similar in identifying what engineering activities need to be performed, but they
are different in how these activities are carried out (Farr & Buede, 2003).
While SE focuses on the early phases of product and service implementation, EM
oversees the overall lifecycle of such products and services. SE includes the planning,
design, execution, control, and closure phases, while the responsibilities of EM extend to
the financial and resource management of engineering initiatives (Walden et al., 2015).
SE employs approaches that are not intrinsic to EM, yet these approaches,
including systems thinking, trade-off analysis, modeling and simulation, and prototyping
(Locatelli et al., 2014), support engineers in the successful management of their work. An
established SE approach that encapsulates all of these similarities and differences is
known as model-based systems engineering (MBSE) (Walden et al., 2015; Ramos et al.,
2012; Bjorkman et al., 2012).
1.4 ITIL and Systems Engineering
Systems engineering is the process of applying frameworks, techniques, and tools
to the development of systems in general. The processes of systems engineering are
associated with the stages or phases of a system life cycle. According to Walden et al.
(2015), these processes are Agreement Processes, Project Processes, Technical Processes,
and Evaluation Processes. Figure 1-3 depicts these four processes in addition to the
underlying subprocesses.
Figure 1-3. Systems Engineering Processes (Walden et al., 2015).
By investigating the processes in both ITIL and systems engineering, one can
identify a great deal of similarities and differences between the two (Figure 1-4).
Processes such as Project Portfolio Management, Configuration Management, Supply
Management, and Operations Management are among the similar ones. These are
highlighted in green in Figure 1-4. On the other hand, processes such as Verification,
Validation, Human Resource Management, and Acquisition Management exist in
systems engineering but may not be obvious in the ITIL framework, as highlighted in red
in Figure 1-4. The processes of systems engineering that are considered to complement
and support the implementation of ITIL (shown in blue in Figure 1-4) are the subject of
investigation in this research.
Figure 1-4. Process Similarities and Differences between ITIL and Systems
Engineering.
1.5 Model-Based Systems Engineering
MBSE is an engineering approach for modeling systems and processes that are
complex in nature and involve the integration of hardware, software, and interactions
with humans and organizations (Motamedian, 2013). MBSE is “the formalized
application of modeling to support system requirements, design, analysis, verification
and validation activities beginning in the conceptual design phase and continuing
throughout development and later life cycle phases” (Crisp, 2007, p. 15). The
main purpose of using MBSE is to facilitate the delivery of products and services while
managing the associated risks, improving the solution quality, and supporting decision-
making practices (Ramos et al., 2012; Bjorkman et al., 2012; Lima et al., 2018).
One of the main benefits of employing MBSE in managing implementation
initiatives is its reliance on the modeling and simulation of the “to-be” state compared
with the traditional document-based implementation approaches (Izukura et al., 2013;
Orta et al., 2014; Tsadimas et al., 2016). MBSE supports managers with the means to
develop implementation alternatives that are needed to carry out the activities presented
in implementation documents before costs are incurred and efforts are expended (Sharon
et al., 2013).
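As a concrete illustration of evaluating a “to-be” state before costs are incurred, a minimal discrete-event sketch of a single-agent service desk (all rates and parameter names are hypothetical) can be used to compare candidate implementation alternatives:

```python
import random

def simulate_service_desk(n_incidents=10_000, arrival_rate=4.0,
                          handling_rate=5.0, seed=1):
    """Minimal discrete-event sketch of a single-agent service desk with
    exponential interarrival and handling times (rates per hour, hypothetical).
    Returns the mean time incidents wait before an agent begins handling them."""
    random.seed(seed)
    clock = agent_free_at = total_wait = 0.0
    for _ in range(n_incidents):
        clock += random.expovariate(arrival_rate)   # next incident arrives
        start = max(clock, agent_free_at)           # queues while agent is busy
        total_wait += start - clock
        agent_free_at = start + random.expovariate(handling_rate)
    return total_wait / n_incidents

# Compare two candidate "to-be" scenarios before committing resources.
baseline = simulate_service_desk()
faster = simulate_service_desk(handling_rate=8.0)
print(round(baseline, 3), round(faster, 3))
```

A model like this lets managers weigh, for example, a faster handling rate against its staffing cost purely in the model, before any effort is expended on the actual implementation.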
1.6 Research Motivation
The main motivation for conducting this research is to enable IT project managers
to leverage the proven methodologies and tools of systems engineering to better deliver
ITIL implementation projects. Specifically, the research is motivated by MBSE’s features
of enhancing interorganizational communication, improving collaboration among project
stakeholders, and improving decision-making at every project management phase. These
features address the concerns of project managers, who are constrained by the
project scope, budget, and delivery timeline.
1.7 Problem Statement
The ITIL framework is widely used to manage the strategy, design, transition,
operation, and continual improvement of IT services. While the ITIL framework itself
has undergone numerous revisions and refinements, successfully managing ITIL
implementation within organizations is challenging due to several limitations. These
limitations are associated with managing collaboration and communication within
organizations, meeting stakeholder requirements and service quality objectives, managing
risk, and practicing effective decision making. Although modeling approaches have
generally been used to analyze ITIL implementation, such approaches tend to focus on
individual ITIL modules or a specific implementation limitation as opposed to the entire
ITIL framework.
1.8 Thesis Statement
This research suggests that the use of MBSE enhances the implementation of ITIL
in terms of its service strategy, design, transition, operation, and continual improvement.
These enhancements are captured by modeling service requirements and by designing
and simulating service behaviors, with the main objective of developing
recommendations for efficient ITIL implementation in a given organization.
1.9 Research Objectives
We propose the use of MBSE, which has been shown to provide benefits such as
improved collaboration among stakeholders, enhanced decision-making practices,
reduced operational risk, and an improved quality of service to organizations. Because
MBSE spans the entire life cycle of products and services, it has the potential to
holistically improve the implementation of ITIL across an organization (Walden et al.,
2015; Ramos et al., 2012; Bjorkman et al., 2012). This research proposes an MBSE
approach for ITIL implementation that results in improvements to customer satisfaction
and service availability and efficiency. Our MBSE approach utilizes the general-purpose
SysML. The proposed SysML-based ITIL implementation is also augmented with
simulations to validate improvement recommendations for a real-world use case.
1.10 Research Questions and Hypotheses
To understand the research questions and hypotheses, some clarification of
terminology is necessary (Axelos, 2011):
1. Customer Satisfaction: Customer satisfaction is associated with responding to the
customer as quickly as possible while minimizing the impact to the business. To
measure customer satisfaction, the mean waiting time is used. Waiting time is
defined as the time from when a customer contacts the service desk until a service
agent acknowledges and logs the customer complaint.
2. Risk: Risk is associated with maintaining the availability of IT services. The
downtime is used to measure such availability and is defined as the total time when
an IT service is not available. It is often calculated from the time when a customer
contacts the service desk until the incident is fully resolved.
3. Efficiency: Efficiency is related to categorizing, escalating, and processing
incidents. The service time before escalation is used to measure the efficiency.
Escalation happens when a lower-level support agent forwards an incident to the
next immediate higher support level.
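These three measures can be computed directly from time-stamped incident records. The sketch below assumes a hypothetical record layout (field names are illustrative, and time to escalate is measured here from logging to escalation):

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Incident:
    reported: float   # customer contacts the service desk (hours)
    logged: float     # agent acknowledges and logs the complaint
    escalated: float  # incident forwarded to the next support level
    resolved: float   # incident fully resolved

def waiting_time(i):                     # customer satisfaction metric
    return i.logged - i.reported

def downtime(i):                         # availability (risk) metric
    return i.resolved - i.reported

def time_to_escalate(i):                 # efficiency metric
    return i.escalated - i.logged

incidents = [  # hypothetical records for illustration
    Incident(reported=0.0, logged=0.2, escalated=1.0, resolved=4.0),
    Incident(reported=1.0, logged=1.1, escalated=2.5, resolved=3.5),
]
print(mean(map(waiting_time, incidents)),
      mean(map(downtime, incidents)),
      mean(map(time_to_escalate, incidents)))
```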
Hence, the research questions are:
RQ1: Does employing MBSE increase customer satisfaction in ITIL
implementation?
RQ2: Does employing MBSE reduce the risk resulting from ITIL
implementation?
RQ3: Does employing MBSE increase the efficiency in ITIL implementation?
The research hypotheses are:
1. Waiting time
Ho1: MBSE’s mean waiting time is greater than or equal to the target
mean waiting time
Ha1: MBSE’s mean waiting time is less than the target mean waiting
time
2. Downtime
Ho2: MBSE’s mean downtime is longer than or equal to the target mean
downtime
Ha2: MBSE’s mean downtime is shorter than the target mean downtime
3. Time to escalate
Ho3: MBSE’s mean time to escalate is different from the target mean time
to escalate
Ha3: MBSE’s mean time to escalate is not different from the target mean
time to escalate
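One-sided hypotheses of this form can be evaluated with a one-sample t test against the target value. The sketch below uses hypothetical waiting-time data and a hypothetical target; the resulting t statistic would be compared to the critical value at n−1 degrees of freedom:

```python
from math import sqrt
from statistics import mean, stdev

def one_sided_t(sample, target):
    """t statistic for testing whether the sample mean falls below a target
    (null: mean at or above target; alternative: mean below target).
    Reject the null for large negative t relative to the critical value."""
    n = len(sample)
    return (mean(sample) - target) / (stdev(sample) / sqrt(n))

# Hypothetical post-MBSE waiting times (hours) and a hypothetical target.
waiting_times = [0.18, 0.22, 0.15, 0.20, 0.17, 0.19]
t = one_sided_t(waiting_times, target=0.25)
print(round(t, 2))
```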
1.11 Scope of Research
In this research, an MBSE approach is proposed to support the management of
ITIL implementation initiatives and provide a means for predicting performance results
from a proposed implementation. In particular, this research explores how MBSE can be
used to improve customer satisfaction and provide desired levels of service availability
and efficiency targets based on an organization’s capacity and capabilities. This research
also explores ways to evaluate alternatives at the requirement definition phase, leveraging
MBSE to assess the planned ITIL implementation and determine whether the planned
implementation will meet organizational requirements.
1.12 Research Limitations
This research is subject to several limitations. First, ITIL implementations
are considered internal projects to organizations and cover a great deal of organizational
capabilities and practices. Hence, data is constrained by the publicly available sources
which, in turn, are scarce when actual ITIL measurements are sought. Second, while a
number of MBSE modeling languages exist, this research is limited to SysML, which is
the most widely used MBSE modeling language (Locatelli et al., 2014). Beyond its
common use, SysML is readily available and supported by many commercial software
packages. Third, the proposed MBSE approach
was validated through the analysis of a publicly available dataset and has not yet been
implemented for a new ITIL implementation initiative. Real-world validation is
recommended as an area for future research.
1.13 Organization of Praxis
This praxis is organized as follows. Chapter 2 provides a summary of the existing
literature on the challenges encountered during ITIL implementation and a discussion on
how these challenges can be addressed using MBSE. In Chapter 3, an introduction to the
methodologies for addressing these challenges using MBSE is provided. Chapter 4
focuses on employing this methodology for a specific use case and evaluates the benefits
of adopting MBSE when implementing ITIL in a commercial bank. Finally, Chapter 5
provides concluding remarks and suggests further directions for this research.
Chapter 2—Literature Review
2.1 Introduction
The ITIL, with its latest release of version 3, is recognized as the most widely
adopted IT service management framework (Iden and Eikebrokk 2013; Pereira and Silva
2010; Iden and Eikebrokk 2014b). More organizations are seeking to implement the ITIL
due to its non-proprietary nature. The ITIL’s standardization approach is “a driver of
performance improvements in terms of cost, time, efficiency, effectiveness, quality and
responsiveness” (Romero et al. 2015, p. 266). Similar to the general systems engineering
approach, the ITIL implementation follows a life cycle approach with five core modules:
service strategy, service design, service transition, service operation, and continual
service improvement (Ahmad and Shamsudin 2013).
Since its first release, the ITIL has evolved to provide an up-to-date best practices
framework to specifically support the employment of IT service management in any
organization regardless of its size or business domain. The successful implementation of
the ITIL is associated with reducing the occurrences and recurrences of IT incidents and
problems, increasing IT service levels and improving customer satisfaction, and reducing
the total cost of IT asset ownership. All such outcomes result in customers gaining
confidence in IT services and an organization enhancing its overall productivity and
hence profitability (Sebaaoui and Lamrini 2012).
There are, however, potential challenges that an organization can face if the ITIL
implementation is not performed correctly (Marrone and Kolbe 2011). Gacenga et al.
(2010) concluded via an ITIL assessment that the implementation varies significantly
among organizations. Some organizations chose not to implement all five ITIL modules,
and the majority of them selectively implemented only the change management and
incident management processes. Clearly, the partial implementation of selected modules
and processes will not yield the same value to an organization as a full implementation
would. This selective partial implementation is attributed to a number of factors
(Cronholm and Persson 2016). These factors include challenges in internal and
interdepartmental communications within an organization, the identification and
involvement of key stakeholders, the assumption that the framework will fit from the start
with minimum customization, and the expectation that the value realization will be
immediate (Iden and Eikebrokk 2014b).
Although the ITIL implementation is described as being complex and
challenging, organizations continue to resort to the general ITIL best practices with little
exploration of the impact of such implementation on the workplaces of those
organizations (Pereira and Silva 2010). The literature shows that some organizations
underestimate the duration, risk, and cost of ITIL implementation and the effort required
and choose to proceed with a document-heavy implementation, with little attention paid
to what modeling and simulation reveal about this type of implementation (Pereira and
Silva 2010; Iden and Eikebrokk 2014b; AlShamy et al. 2012).
Some research has been conducted to study factors that influence the success of
ITIL implementation. Lema et al. (2015) suggested that the implementation order of the
ITIL modules does not need to be the same for all organizations and that organizations
should start with the modules that represent quick wins. Chan et al. (2009) implied that
measuring the implementation quality is essential in determining the success of the
implementation and suggested aligning the ITIL with Six Sigma to improve such
15
implementation. Others combined the ITIL with different IT frameworks, such as
COBIT, ISO/ISE 27001, and eTOM, to support the implementation effort (Pillai et al.
2014; Denda and Drajic 2013; Sahibudin et al. 2008).
Modeling that supports specific ITIL service implementation is explored in the
literature. The requirements for designing an ITIL configuration management database
(CMDB), for instance, were investigated using a model-driven architecture via the UML
at the requirement management stage (Jelliti et al. 2010). Orta and Ruiz (2014) presented
a modeling effort to support the decision-making process in the IT service strategy
module of the ITIL. Others used a customized and proprietary modeling tool to analyze
the IT incident management service (Bartolini et al. 2008). Izukura et al. (2011)
developed an in-house tool to evaluate the requirements and performance of IT hardware
systems using SysML. A mathematically based business-driven model for understanding
and capturing the business value and quality of IT services was presented by Lima et al.
(2012). Simulation, on the other hand, has received less research attention in the
literature. A recent literature review revealed only individual cases of certain ITIL
services that were supported by simulation efforts before and during service
implementation (Manoel et al. 2017).
In this chapter, the literature relevant to the challenges that managers encounter
when implementing ITIL in organizations is reviewed. Furthermore, this review
summarizes how SE in general and MBSE in particular can support the implementation
of complex IT initiatives. The gaps associated with the cited challenges and the
expectations of the contributions MBSE can make to address these challenges are
outlined in the context of managing the implementation of ITIL initiatives.
16
2.2 Challenges in Managing the Implementation of ITIL Initiatives
ITIL, a trademark of Axelos (Axels, 2011), is recognized as the most extensively
adopted IT service management framework (Pereira & Silva, 2011; Iden & Eikebrokk,
2014; Iden & Eikebrokk, 2013). Although other frameworks, such as ISO 9001, ISO/IEC
15504, ISO/IEC 20000, CMMI-SVC and COBIT, can be utilized to improve IT service
management and operations, ITIL is still the de facto framework around the globe (Diirr
& Santos, 2014; Eikebrokk & Iden, 2016). This is supported by agreement in the
literature that ITIL achieves benefits, especially in operationalizing continuing service
improvements that are not readily realized by other frameworks in the field of IT service
management (Eikebrokk & Iden, 2016). ITIL has 26 IT services organized as 5 modules:
service strategy, service design, service transition, service operations, and service
continual improvement (Axels, 2011). These modules and services are listed in Table 2-
1.
Table 2-1. ITIL Framework Modules and Core Services.
ITIL service module Core service
Service Strategy
Demand Management
Financial Management
Service Portfolio Management
Risk Management
Service Design
Availability Management
Capacity Management
17
IT Service Continuity Management
Service Catalog Management
Service Level Management
Service Provider (Supplier) Management
Service Transition
Transition Planning and Support
Change and Evaluation Management
Knowledge Management
Release and Deployment Management
Service Asset and Configuration Management
Service Validation and Testing
Service Operation
Access Management
Event Management
Incident Management
Problem Management
Request Management
Continual Service
Improvement
Identify and Deliver Service Improvement
Service Measurement and Performance
Management
18
A recent benchmarking conducted by Axelos revealed several challenges
encountered during and after ITIL implementation initiatives. These challenges include a
lack of visibility for implementation teams, in insufficient understanding of customer
needs, and a lack of stakeholder collaboration (Axels, 2017). The benchmarking
highlighted that compartmentalized implementation, both functionally and servicewise, is
one of the major challenges a project manager faces when managing an ITIL
implementation regardless of the organization’s size (Axels, 2017). With more than 2,000
pages of ITIL framework documentation in 5 reference manuals, managing an ITIL
implementation project and defining its scope could be intimidating, and the need to
utilize an implementation approach that is manageable, traceable, scalable, and verifiable
is paramount (Cronholm & Persson, 2012).
Researchers have described other challenges that an organization faces if the
implementation of ITIL is undermanaged (Marrone & Kolbe, 2011). Gacenga et al.
(2010) concluded in an ITIL assessment that its implementation varies significantly
among organizations. In another investigation, many organizations chose not to
implement all five ITIL modules or selectively implemented only change management
and incident management services (Marrone & Kolbe, 2011). Clearly, the partial
implementation of selective modules and/or services will not yield the same value to an
organization as the full implementation of the framework (Marrone & Kolbe, 2011;
Bernard, 2014). In a noticeable number of cases, it was assumed that the framework
would be immediately implementable with a minimum number of required
customizations, and there was an expectation of an immediate return on investment (Iden
& Eikebrokk, 2014).
19
Researchers also attributed ITIL implementation challenges to organizational
factors, such as a lack of support from higher management, low levels of education and
awareness among staff regarding the implementation of projects, and poor training on
ITIL services, in addition to a failure to disseminate and share information on the
implementation of ITIL projects (Iden & Eikebrokk, 2014; Iden & Eikebrokk, 2014).
Some challenges specific to the initiatives themselves include management
methodologies, decision-making processes, quality management, and risk management
(Lema et al., 2015).
Several researchers have suggested that the implementation sequence of ITIL
modules does not need to be the same for all organizations and that organizations should
start with the modules that present quick wins (Pereira & Silva, 2011; Orta et al., 2014;
Nicho & Mourad, 2012; Ahmad & Shamsudin, 2013; Orta & Ruiz, 2014; Valverde &
Talla, 2014; Pillai et al., 2014; Lima et al., 2012). All of these challenges are typically
encountered during and after ITIL implementation and can result in the suspension of the
implementation or, in many cases, missed deadlines and/or budget overruns (Pereira &
Silva, 2011).
2.3 Using Modeling and Simulation to Support ITIL Implementation Initiatives
The modeling and simulation approach provides organizations with the ability to
predict the behaviors of IT services and demonstrate their desired performance levels
(Carley, 1994; Soo-Haeng & Eppinger, 2005). Typically, this ability should support
assessments of customer satisfaction in addition to service availability and efficiency
(Orta et al., 2014; Orta & Ruiz, 2014; Valverde & Talla, 2014). Modeling and simulation
20
also support the discovery and explanation of service behaviors, which may occur during
unexpected operations (Carley, 1994; Manoel et al., 2017; Bartolini et al., 2008).
Modeling that supports the implementation of specific ITIL services has been
explored in the literature. For instance, the requirements for designing an ITIL
configuration management database (CMDB) were investigated using a model-driven
architecture at the requirements management stage (Jelliti et al., 2010). Orta and Ruiz
(2014) presented a model to support the decision-making process in the IT service
strategy module of ITIL.
Other researchers used customized and proprietary modeling tools to analyze IT
incident management services (Bartolini et al., 2008). Izukura et al. (2011) developed an
in-house modeling tool to evaluate the requirements and performance of IT hardware
service management. A mathematically based business-driven model was also presented
by Lima et al. (2012) to understand and capture the business value and quality of IT
services. Simulation has also received research attention in the literature. For example, a
recent literature review revealed cases of specific ITIL services that were supported by
simulation efforts both before and during service implementation (Manoel et al., 2017).
A number of ITIL-specific performance measures have been defined and used in
the literature. These measures are referred to as critical success factors (CSFs) (Aire et
al., 2011). Generally, these factors are classified into several categories, including
implementation management, communication, organization-related aspects,
measurements, and tools. Some specific CSFs include incremental service
implementation, service prioritization, quality- and risk-driven implementation, and
performance measures for all ITIL services (Nicho & Mourad, 2012).
21
2.4 Using SE to Support the Management of Complex and Challenging Initiatives
SE is the field of engineering that enables the successful delivery of complex and
challenging initiatives (International Council on Systems Engineering, 2011). Locatelli et
al. (2014) presented SE tools and approaches that are relevant to supporting initiatives’
managers. Among the presented approaches are systems thinking, trade-off analysis,
requirement management tools, and modeling and simulation. Zhu and Mostafavi (2017)
built on the work of Locatelli et al. (2014) and used the SE system-of-systems (SoS)
approach to assess the performance of complex IT initiatives.
Researchers have investigated how SE approaches may be employed in
monitoring and controlling IT initiatives and attempted to model uncertainty behaviors
using system dynamics. Ahlemann (2009) proposed a reference model to support the
acceleration of the management of IT initiatives. To assess the performance of IT
initiatives, Ebner et al. (2016) used a design theory based on strategic IT benchmarking.
Gelbard et al. (2002) presented a model that integrated both systems analysis and
initiative management and mapped data flow diagrams to Gantt diagrams.
MBSE is an SE approach that aims to create a digital model of a given system or
process. The International Council on Systems Engineering (INCOSE) defines MBSE as
an interdisciplinary approach used to enable the realization of successful initiatives
(Walden et al., 2015). The evolving complexity of initiatives calls for the implementation
of MBSE (Tsadimas et al., 2016; Nikolaidou et al., 2015; Nikolaidou et al., 2016). Such
complexity is reflected by increasingly detailed and integrated requirements and
interactions that are challenging (both internally and externally) to the initiative
(Bjorkman et al., 2012; Motamedian, 2013), in addition to competing needs of
22
stakeholders — especially when they are geographically dispersed (Overhage et al.,
2010).
MBSE has been used extensively to manage both engineering and non-
engineering initiatives. According to one survey, MBSE is used to manage initiatives in
the defense, automotive, space systems, and training and consulting domains with
different levels of awareness and initiative management practices (Motamedian, 2013).
MBSE has also been applied to support specific initiative management activities such as
requirement management, design, verification and validation (Bjorkman et al., 2012),
architecture and trade analysis, operational analysis and management, and product life
cycle management (Sharon et al., 2013).
As with any modeling approach, MBSE relies on the proper use of a modeling
language. A number of MBSE languages have been discussed in the literature, including
the Systems Modeling Language (SysML), object-process methodology (OPM), object-
oriented SE method (OOSEM), rational unified process for SE (RUP-SE, developed by
IBM), and the Vitech MBSE modeling language (Ramos et al., 2012). SysML was
developed by the Object Management Group (OMG) as an extension of the universal
modeling language (UML). Similar to UML, SysML uses various diagram types, as
depicted in Figure 2-1. These diagram types include package, requirements, block,
behavior, and parametric diagrams. SysML is a “general-purpose modeling language that
is intended to support many different modeling methods, such as structured analysis and
object-oriented methods” (Walden et al., 2015, p. 188). One of the unique benefits of
SysML is its capacity to improve communications among teams and stakeholders using a
set of standard diagrams in a single model (Locatelli et al., 2014).
23
Figure 2-1. Standard SysML Diagrams (Walden et al., 2015).
2.5 MBSE and Managing ITIL Implementation Initiatives
As discussed earlier, ITIL implementation initiatives are faced with challenges
that may adversely impact their success. Although these challenges are understood and
have been researched, the literature offers few recommendations that managers can
follow to collectively overcome these challenges. The literature also indicates that SE
approaches empower project managers to deliver complex initiatives beyond the
traditional success criteria within the budget and scope. However, there are few
references in support of leveraging SE approaches to manage the implementation of ITIL
initiatives. Despite its many benefits to project managers, it is evident that MBSE has not
been extensively applied to manage the implementation of ITIL projects.
Based on these findings, we determined that an exploration of novel approaches
using MBSE to manage ITIL implementation initiatives is a promising area in need of
further investigation. In our research, we examined how the use of MBSE can enhance
the implementation of ITIL in terms of its service strategy, design, transition, operation,
and continual improvement. These enhancements are specifically captured by modeling
24
the service requirements, design and simulation of service behaviors with the main
objective of developing recommendations for efficient ITIL implementation for a given
organization. Figure 2-2 depicts the gap in the literature we discovered defined by the
intersection of the existing literature on implementation approaches to MBSE and ITIL.
Additionally, Table 2-2 shows a summary of relevant research addressing ITIL’s
implementation limitations.
Figure 2-2. Venn Diagram Depicting the Literature Gap in Using MBSE for ITIL
Implementation.
Table 2-2. Relevant Research on ITIL Implementation.
Author and
year
Research contribution to
ITIL implementation
Further research
recommendation
Iden and
Eikebrokk
(2013)
Conducted a literature
review on ITIL
implementation and its
critical success factors,
Suggested further research
on how ITIL
implementation can enable
alignment, governance,
communication, and
25
outcomes, benefits, and
performance management.
knowledge management in
organizations.
Pereira and
Silva (2010)
Showed through a
questionnaire that ITIL
implementations are
inconsistent with the best
practices and proposed a
maturity model to improve
such implementations.
Recommended designing a
maturity model that assists
organizations in self-
assessing their ITIL
implementations.
Iden and
Eikebrokk
(2014)
Presented a survey study
that showed the
relationship between ITIL
implementation success
and the efficient
involvement of
stakeholders.
Suggested a more extensive
study of the nature of ITIL
services and how to group
them for successful
implementation.
Romero et
al. (2015)
Presented two separate
simulations of capacity
management and incident
management to support
decision-making.
Suggested more integrated
ITIL modeling with the use
of multi-objective
optimization.
26
Ahmad and
Shamsudin
(2013)
Proposed human and
technological critical
success factors that
improve ITIL
implementations using a
survey.
Recommended
understanding and
analyzing the specific
organizational contexts
before embarking on ITIL
implementation initiatives.
Sebaaoui
and Lamrini
(2012)
Motivated by general
project management
practices, they proposed an
approach to implement
ITIL services.
Recommended the
inclusion of stakeholder
management and staff
awareness in their proposed
approach.
Marrone
and Kolbe
(2011)
Based on viewpoint of IT
experts, they provided an
empirical study correlating
ITIL implementation levels
and maturity with the
benefits realized in
organizations.
Proposed extending their
study to cover viewpoints
of business users and the
overall business-IT
alignments’ correlation
with ITIL implementation
maturity.
Gacenga et
al. (2010)
Conducted a field study on
ITIL implementations and
concluded that
performance management
Suggested expanding their
study to further define and
analyze specific goals and
metrics that should be
27
should be incorporated
with each implementation.
incorporated within ITIL
implementations.
AlShamy et
al. (2012)
Investigated the
organizational culture
impact on ITIL
implementation through a
field study.
Recommended the use of
modeling to capture other
factors impacting the ITIL
implementation.
Lema et al.
(2015)
Conducted a survey to
investigate the most used
ITIL implementation
sequences.
Proposed expanding their
study to a broader sample
to understand the
interconnections between
process improvement and
ITIL implementation.
Chan et al.
(2009)
Explored how Six Sigma
can supplement ITIL
implementation in meeting
management’s quality
objectives using a
qualitative study.
Recommended conducting
quantitative studies to
demonstrate the value Six
Sigma brings to ITIL
implementation.
Denda and
Drajic
(2013)
Incorporated eTOM with
ITIL incident management
service in an international
Suggested similar
integrations with other ITIL
services.
28
tele-communication
project.
Sahibudin et
al. (2008)
Presented a combined
framework of ITIL,
COBIT, and ISO 27002
with potentially higher
value compared to the
individual frameworks.
Recommended the
implementation of their
combined framework in
real-world case studies.
Jelliti et al.
(2010)
Demonstrated integration
between services in the
ITIL service operations
module with the CMDB
via a model-driven
architecture.
Proposed extending the
model to integrate other
ITIL modules with the
CMDB.
Orta and
Ruiz (2014)
Presented a dynamic model
to evaluate the ITIL
strategy fulfillment goals.
Recommended the
development of simulation
models to support decision-
making for different ITIL
services.
Lima et al.
(2012)
Offered an estimation
model to capture both the
value and quality in the
context of the ITIL
Suggested the addition of
risk and decision-making
and linking their model to
29
continual service
improvement module.
the organization’s balanced
scorecard model.
2.6 Using MBSE to Address ITIL Implementation Challenges
The ITIL framework describes a set of service requirements with significant
details that can be encapsulated within MBSE during the definition stage (Axels, 2011).
In addition, MBSE can accommodate organization-specific requirements such as business
models, project management, finances, resources, and operational requirements.
Stakeholders’ views are captured via SysML to reflect their perspectives regarding how
ITIL should be implemented and operated in their organization. Using an MBSE
approach, these views can be developed iteratively, and refinements can be made based
on the overall organizational view of the ITIL service of interest. Eventually, the
organizational view of how a service functions is derived from individual views of the
contextual, conceptual, logical, physical, and actual representations (Fatolahi & Shams,
2006).
Each service has functional requirements derived from the ITIL framework,
including how the service should operate and how it can be continually improved. During
the early stages of considering the architecture and service requirements, MBSE
facilitates how the details of the service implementation may be fully understood, and
potential implementation risks can be defined at high levels. Decisions regarding the ITIL
service design, implementation, and operations must be included in the definition of the
initial requirements, but they do not need to be detailed.
30
The proposed MBSE approach ensures that each risk and decision point is
associated with an individual service in isolation from the remaining ITIL modules in
addition to the performance requirements of the service. Eventually, as services are
integrated within their respective ITIL modules, the overall requirements are defined,
including the definitions of risks, decision points, and service performance.
Using MBSE at the start of an ITIL implementation initiative enables each
stakeholder to contribute relevant knowledge to that initiative and iteratively improve the
quality of such knowledge. The resulting model-based representation eventually contains
all of the knowledge needed to start the initiative while engaging the associated
stakeholders.
Additionally, MBSE has an intrinsic capability to rationalize and streamline the
feedback process provided by various stakeholders during the ITIL model-based
implementation from start to finish. Hence, critical decisions during the service life cycle
are made based on agreed-upon assumptions and facts. Real-world data can be collected
and fed into the service model to support additional refinements if changes need to be
made. Contrary to many current ITIL implementation approaches, verifying and
validating the service design and development are model-based processes that can occur
at any time during the IT service life cycle.
With MBSE, an ITIL implementation is no longer limited to certain modules or a
specific service, nor is it required to follow the standard implementation sequence from
service strategy to service continual improvement. Instead, ITIL modules and services
can be managed concurrently, and the effects of specific and isolated local decisions can
be linked to other services and to the overall implementation. Similarly, risks that are
31
accepted for a specific service or module are easily investigated for other services. Other
valuable model-based benefits, including cost reduction, improved service quality, and
enhanced process management, are supported when employing MBSE for ITIL
implementation.
In our proposed approach, we start by modeling the ITIL framework using MBSE
to capture the overall set of ITIL services (Figure 2-3). The input to this step is the set of
ITIL framework v3 manuals (Axels, 2011). These manuals explain each ITIL service in
complete detail. For each ITIL service, the manuals describe the requirements, design,
measurements, and continuous improvement steps. Our MBSE approach uses SysML
diagrams and artifacts and creates a digital representation of the ITIL framework. For
example, the structure of the ITIL framework is modeled using package and block
diagrams, the service requirements are modeled using requirement diagrams, the service
design is modeled using block, activity, and sequence diagrams, and the service
performance measurements are modeled using parametric diagrams.
32
Figure 2-3. The Proposed MBSE Approach for ITIL Implementation.
33
Chapter 3—Methodology
3.1 Introduction
With the literature gap identified in Chapter 2, the research question that this
study attempts to answer is how MBSE can be used to support the planning and
management of ITIL implementation initiatives. To attempt to answer this research
question, a set of specific research hypotheses are formulated and tested (Chapter 4)
based on real-life measurements. Based on the earlier discussion of the benefits of
MBSE, one should also consider the cited intrinsic value MBSE brings to an organization
in terms of risk management, quality improvement, decision-making support practices,
and improving the collaboration and communication within an organization in addition to
the management of stakeholders’ requirements.
3.2 Using MBSE to Model ITIL
To support our investigation, the standard ITIL v3 framework is modeled using
MBSE. The definitions, requirements, designs, and performance measurements of
services in addition to the interrelationships among the services are captured in the
proposed model using SysML. Real-life measurements from a commercial bank (Chapter
4) are then used to statistically test the validity of the proposed model in two phases.
First, the proposed MBSE approach is validated as to whether it generates an adequate
representation of the current ITIL implementation of the bank. Second, when capturing
additional specific business requirements, such as a target mean time to acknowledge a
customer’s complaint, the model’s ability to generate results that meet the additional
requirements is determined. This research hypothesizes that employing MBSE to model
34
ITIL implementation leads to improvements in fulfilling management’s requirements.
The SysML code is created using MagicDraw’s Cameo and simulated using MATLAB.
The initial stage of employing MBSE to manage the implementation of ITIL
initiatives is to organize the model to essentially map ITIL modules onto the SysML
package and block diagrams and to provide additional elements to support an
understanding of the model, change control, and reusability. The MBSE organization is
structured into packages, where each package contains the model elements representing
the ITIL artifacts. The stakeholder packages are included to capture the views of each
stakeholder involved in the ITIL implementation. The model’s architecture facilitates an
iterative approach in capturing each stakeholder’s view, requirements, and specific model
results. Each structural hierarchy is further decomposed into detailed model elements. In
our MBSE approach, all five ITIL modules and services are available and are thus
modeled.
The implementation project manager identifies the project stakeholders and uses
the ITIL framework reference to create a SysML representation of the project structure,
as depicted in Figure 3-1. In this figure, the overall ITIL implementation context includes
the five ITIL modules, the core services in each module, and the stakeholders in a block
diagram. The constraints and parts of each service are also captured in this initial view of
the model. All stakeholders have access to this model and can recommend changes.
These changes are controlled and managed via model versioning and communicated to
the project teams. Because every project team member and stakeholder has access to this
digital model, the need to rely on document-based communications and changes, as with
traditional ITIL implementation approaches, is eliminated.
35
Figure 3-1. Overall ITIL Implementation Project Structure Using SysML.
36
The proposed MBSE model is organized to essentially map the ITIL modules and
to provide additional elements to support the model’s understandability, change control,
and reusability. The MBSE organization is depicted in packages (Figure 3-2), with each
package containing the model elements that represent the ITIL artifacts. The stakeholder
package is included to capture the viewpoint of each stakeholder involved in the ITIL
implementation. The model architecture, provided in Figure 3-3, facilitates an iterative
approach to capturing each stakeholder’s viewpoint, requirements, and specific model
results.
Figure 3-2. Proposed MBSE Organization.
37
Figure 3-3. Proposed MBSE Architecture Framework
Each structural hierarchy is further decomposed into detailed model elements. For
example, the service strategy module hierarchy includes demand management, financial
management, risk management, and service portfolio management, as presented in the
block diagram shown in Figure 3-4.
38
Figure 3-4. Block Definition Diagram of ITIL Service Strategy Module.
To further demonstrate the proposed MBSE structure, consider the demand
management (DM) service of the ITIL service strategy module. DM is the service that
enables an organization to identify their customers’ actual needs and deliver services and
determine the usage levels of the monetized services. The main goal of DM is to calibrate
the service supply to customers’ demands, in the optimal manner, given the resources that
are available to an organization. This goal requires analyzing customers’ behaviors and
predicting future demands and the corresponding capacity. The requirement definitions as
per the ITIL framework for the DM service are presented in the requirements table in
Figure 3-5, and the activity diagram in Figure 3-6 illustrates the demand management
service design.
39
Figure 3-5. Requirement Definitions for the Demand Management Service.
Figure 3-6. Activity Diagram for Demand Management Service.
The proposed MBSE supports the modeling of trade studies to assess specific
design criteria. The ITIL v3 framework provides a set of standard CSFs and key
performance indicators (KPIs) for the DM service. The SysML measures of effectiveness
(MoEs) are used to capture these KPIs using constrained block diagrams as follows. A
40
CSF for DM is to carry out a specific number of customer demand patterns. The higher
this number is, the more effective the DM service becomes. An organization can specify
a certain KPI that is both achievable and realistic. Another CSF for DM is to carry out a
customer profile analysis to determine the behaviors of customers; the associated KPI is a
predefined number of customer profiles analyzed. Figure 3-7 displays how these
measures can be captured via the constrained block in the proposed MBSE.
Figure 3-7. Encapsulating CSFs and KPIs in the Definition of Demand Management
Service.
In Figure 3-8, an overall MBSE representation for the DM service is depicted.
Model components are defined according to their respective requirements, behaviors,
structures, or parametric diagrams. Relationships between some of these components are
denoted by the arrows in the figure. Referring to the architectural framework on which
this model is based, the levels of abstraction are defined such that a higher abstraction
corresponds to the top row and more detailed abstractions correspond to the lower rows.
41
The DM service modeling is one part of the overall ITIL framework model that an
organization may choose to construct when implementing ITIL modules. The proposed
MBSE is employed to represent other modules and services, with careful attention paid
towards selecting the proper levels of abstraction and capturing multiple stakeholders’
viewpoints. Notably, SysML is generally used to capture the model of the ITIL
framework but will need to be supplemented with an analytical capability to support the
simulation aspects of the entire ITIL implementation.
42
Figure 3-8. MBSE Representation of Demand Management Service.
43
The above was a demonstration of how the DM process is modeled using MBSE.
While we have modeled the entire ITIL framework using MBSE, we will further
illustrate here the project management of another ITIL core service. The implementation
of other services follows a similar pattern. Consider the incident management (IM)
service of the ITIL service operation module. In the case of an incident, IM aims to
restore service operations as soon as possible to minimize service downtime and maintain
customer satisfaction at the desired levels.
The ITIL framework defines nine main requirements to ensure that this service
restoration takes place. Using SysML, Figures 3-9, 3-10, and 3-11 model these
requirements in a requirement diagram and provide an IM service breakdown and the
associated stakeholders who are involved in the IM service from the time a user reports
the incident until it is resolved.
Figure 3-9. Modeling Incident Management Requirements Using SysML.
Figure 3-10. Modeling Incident Management Hierarchy Using SysML.
Figure 3-11. Modeling Incident Management Stakeholder Relationships Hierarchy
Using SysML.
In this MBSE stage, the IM service is designed by refining the model. Project
teams decide on which activities the service restoration must follow, who is responsible
for them, and in which sequence. Each step is further decomposed and refined in the
model to capture the available resources that the organization aims to deploy in this
service. The SysML diagrams used in this step include both activity and sequence
diagrams, as depicted in Figures 3-12, 3-13, and 3-14.
Figure 3-12. Modeling Incident Management Activities Using SysML.
Figure 3-13. Modeling the Identify and Log Incident Activities Using SysML.
Figure 3-14. Modeling the Incident Management Using a Sequence Diagram.
The proposed MBSE approach supports the modeling of trade studies to assess
specific design criteria. The ITIL v3 framework provides a set of standard CSFs and key
performance indicators (KPIs) for the IM service. The SysML measures of effectiveness
(MoEs) are used to capture these KPIs using constraint block diagrams as follows. A
CSF for IM is to restore the service within a defined time. Another CSF for IM is to
respond to the customer’s complaint within a predefined number of minutes.
To simulate the MBSE representation for the IM service requirements and design
developed earlier, a set of additional artifacts is required, as shown in Figures 3-15,
3-16, and 3-17. The additional artifacts are essentially used to define the use case that the
manager would be interested in simulating. Our MBSE approach proposes the application
of a use case diagram to manage incidents, an activity diagram to pass simulation
parameters to the simulation engine, and a state machine diagram to trigger the
simulation exercise and collect output data. The complete MBSE representation for IM is
depicted in Figure 3-18.
Figure 3-15. Simulating the Incident Management Model Using a State Machine
Diagram.
Figure 3-16. Use Case Diagram of Incident Management.
Figure 3-17. Invoking MATLAB to Simulate the Incident Management Model.
Figure 3-18. MBSE Representation for Incident Management Service.
Chapter 4—Results
4.1 Introduction
As management continues to model how ITIL is implemented within its
organization’s context using MBSE, it may opt to simulate how the developed model
behaves under specific assumptions and whether it generates results that meet business
and functional requirements. Simulation generates insights that assist organizations in
making informed decisions regarding the ITIL levels of abstraction and implementation
sequence in a risk-free environment where errors can be made and corrected.
Furthermore, simulation significantly lowers the cost of the ITIL implementation
when different implementation scenarios are modeled, simulated, and analyzed.
Simulating the proposed MBSE approach offers additional benefits, including the
ability to design ITIL implementation alternatives (e.g., varying their scope or
complexity) and to decide which implementation choice best fits the organization's
resources while keeping customer satisfaction at the desired levels.
The IM scenario modeled earlier is simulated and analyzed in this chapter. Table
4-1 shows the standard ITIL framework requirements and the main CSFs with their
associated KPIs for the IM service. The associated modules for these requirements are
also listed in Table 4-1.
Table 4-1. Subset of IM Requirements, Activities, Main CSFs and Associated KPIs
based on the ITIL Framework (Axelos, 2011).

Requirements (grouped by associated ITIL module):

Service Strategy
1. Incident processing and handling should be aligned with the overall service levels and objectives.
2. All incidents should subscribe to a standard classification scheme that is consistent across the business organization.

Service Design
3. Incidents must be resolved within timeframes acceptable to the business.
4. Customer satisfaction must be maintained at all times.
5. All incident records should utilize a common format and set of information fields.
6. A common and agreed-upon set of criteria for prioritizing and escalating incidents should be in place.

Service Transition
7. All incidents should be stored and managed in a single management system.

Service Operation
8. Incident records should be audited on a regular basis to ensure that they have been entered and categorized correctly.

Continual Service Improvement
9. Incidents and their statuses must be effectively communicated in a timely manner.

CSFs and associated KPIs:

1. Customer satisfaction: Respond to the customer as quickly as possible while minimizing the impact to the business.
   KPI 1: Mean waiting time. Waiting time is defined as the time from when a customer contacts the service desk until a service agent acknowledges and logs the customer complaint.
2. Risk: Maintain availability of IT services.
   KPI 2: Mean downtime. Downtime is defined as the total time when an IT service is not available. It is often calculated from the time when a customer contacts the service desk until the incident is fully resolved.
3. Efficiency: Categorize, escalate, and process incidents.
   KPI 3: Mean service time before escalation. Escalation happens when a lower-level support agent forwards an incident to the next immediately higher support level.
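The KPI definitions above can be computed directly from incident timestamps. The sketch below, written in Python rather than the SysML/MATLAB tooling used in this praxis, derives KPI 1 (waiting time) and KPI 2 (downtime) from three hypothetical incident records patterned on the Appendix A data; the timestamps and field layout are assumptions for illustration only.

```python
from datetime import datetime

# Hypothetical incident log: (logged, assigned, closed) clock times.
incidents = [
    ("09:26", "11:00", "11:02"),
    ("09:22", "09:23", "09:25"),
    ("10:42", "11:05", "11:07"),
]

def minutes_between(t0, t1):
    """Elapsed minutes between two same-day HH:MM clock times."""
    fmt = "%H:%M"
    return (datetime.strptime(t1, fmt) - datetime.strptime(t0, fmt)).seconds / 60

# KPI 1: waiting time runs from logging until an agent picks the incident up.
waiting = [minutes_between(log, asg) for log, asg, _ in incidents]
# KPI 2: downtime runs from logging until the incident is fully resolved.
downtime = [minutes_between(log, cls) for log, _, cls in incidents]

mean_waiting = sum(waiting) / len(waiting)
mean_downtime = sum(downtime) / len(downtime)
```

KPI 3 follows the same pattern once escalation timestamps are recorded per support level.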
The ITIL framework recommends three levels when organizing the IM service:
Level 1 Support, Level 2 Support, and Level 3 Support. An incident is escalated between
these support levels when it cannot be resolved at the lower level, while its original
identification and logging are retained. Using discrete event simulation,
incidents are randomly created, identified, escalated, and resolved.
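As one illustration of this discrete-event view, the following Python sketch generates incidents with exponential inter-arrival times and escalates any incident whose sampled handling time at a level exceeds the target thresholds introduced later in Table 4-4, using the distribution parameters of Table 4-3. Ignoring agent queueing (so waiting time is not modeled) is a simplification of this sketch, not of the praxis model.

```python
import random

random.seed(42)

MEAN_INTERARRIVAL = 13.54                      # exponential fit, minutes (Table 4-3)
RESOLUTION = {1: (60.34, 71.99),               # normal (mean, std dev) per level
              2: (70.37, 101.4),
              3: (205.9, 100.7)}
ESCALATE_AFTER = {1: 120, 2: 240}              # target escalation rules (Table 4-4)

def sample_resolution(level):
    # Truncate the normal draw at zero: handling times cannot be negative.
    mean, sd = RESOLUTION[level]
    return max(0.0, random.gauss(mean, sd))

def simulate(n_incidents=250):
    """Each incident starts at Level 1 Support and escalates, keeping its
    original identity, whenever the sampled handling time exceeds the threshold."""
    now, downtimes = 0.0, []
    for _ in range(n_incidents):
        now += random.expovariate(1.0 / MEAN_INTERARRIVAL)   # arrival instant
        elapsed, level = 0.0, 1
        while True:
            work = sample_resolution(level)
            if level < 3 and work > ESCALATE_AFTER[level]:
                elapsed += ESCALATE_AFTER[level]   # time spent before escalating
                level += 1
            else:
                elapsed += work
                break
        downtimes.append(elapsed)
    return downtimes

downtimes = simulate()
mean_downtime = sum(downtimes) / len(downtimes)
```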
4.2 An ITIL Implementation: Case Study
A recent IM service implementation scheme used internally by a commercial
bank is considered and analyzed (BPI Challenge 2014:
http://www.win.tue.nl/bpi/2014/challenge). The measurements are cleansed and
prepared for analysis, and the results indicate that there are three support levels with five
agents, three agents, and two agents allocated to Level 1 Support, Level 2 Support, and
Level 3 Support, respectively. The service operation hours are Monday through Saturday
from 08:00 to 16:00 with an average of 250 incidents logged per day. Based on the CSFs
and KPIs in Table 4-1, the daily statistics of this bank’s implementation of the IM service
are summarized in Table 4-2.
Table 4-2. Performance Summary of the Commercial Bank's IM Service
Implementation.

                                                    Min       Max      Mean   Std. dev.
KPI 1: Waiting time (min)                         14.63    151.00     67.58       35.82
KPI 2: Downtime (min)                             39.22   1441.23    414.59      326.71
KPI 3a: Time to escalate to Level 2 Support
        (min)                                     72.88    347.00    156.15       78.09
KPI 3b: Time to escalate to Level 3 Support
        (min)                                    104.00    574.23    217.90      117.39
It is evident from Table 4-2 that there are inherent deficiencies in this
implementation. First, there is large variability in the waiting time, which ranges
from 14 minutes to 2.5 hours. This variability and the extended waiting times result in heightened
customer dissatisfaction levels. Second, the downtime with a mean of 414 minutes (~7
hours) and a maximum of 1441 minutes (~24 hours) represents a risk with which the
bank must be concerned. Third, the measurements indicate that there are inefficient
procedures that agents at lower support levels follow to escalate incidents. This is
supported by the largely dispersed times of escalation, i.e., from 73 to 347 minutes in the
case of escalations made by Level 1 Support and from 104 to 574 minutes in the case of
escalations made by Level 2 Support.
To employ the proposed MBSE approach for this implementation, two inputs are
required. First, the characteristics of the bank’s ITIL implementation must be extracted
and modeled based on the available measurements. Second, the specific KPI values
should be provided. There are three main variables to be modeled. These variables are the
incident arrival rate, the time that each agent takes to resolve the assigned incident
(resolution time), and the time it takes each agent to escalate an incident to the
immediately higher support level. It is assumed that agents within the same support level
possess identical knowledge and skills and hence can be modeled identically.
The variables are modeled as random variables and fitted with distributions, as
shown in Table 4-3. Using the chi-square goodness-of-fit test and the p-values in
Table 4-3, one can conclude that the selected probability distributions provide
statistically excellent fits to the characteristics of the bank's ITIL implementation. The fitted
probability distributions together with the associated probability plots are displayed in
Figures 4-1 through 4-6.
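The fitting procedure can be reproduced in outline with SciPy (an assumption of this sketch; the praxis used its own statistical tooling). The example draws a stand-in exponential sample, fits the distribution, and runs a chi-square goodness-of-fit test over equiprobable bins:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Stand-in sample; in the praxis these are the observed inter-arrival times.
sample = rng.exponential(scale=13.54, size=250)

# Fit an exponential distribution with the location fixed at zero.
loc, scale = stats.expon.fit(sample, floc=0)

# Chi-square goodness of fit: ten equiprobable bins, observed vs expected counts.
edges = np.quantile(sample, np.linspace(0, 1, 11))
observed, _ = np.histogram(sample, bins=edges)
cdf = stats.expon.cdf(edges, loc=loc, scale=scale)
expected = len(sample) * np.diff(cdf)
expected *= observed.sum() / expected.sum()        # match totals for the test
# One estimated parameter costs an extra degree of freedom (ddof=1).
chi2, p_value = stats.chisquare(observed, expected, ddof=1)
```

A large p-value means the fitted distribution cannot be rejected, which is the sense in which Table 4-3 reports "quality of fit."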
Table 4-3. Summary of the Fitted Probability Distributions.

                                     Probability      Mean   Standard    Quality of fit
                                     distribution            deviation   (p-value, 95%
                                                                         confidence)
Incident arrivals                    Exponential     13.54       -           0.432
Resolution time, Level 1             Normal          60.34     71.99         0.528
Resolution time, Level 2             Normal          70.37    101.4          0.984
Resolution time, Level 3             Normal         205.9     100.7          0.695
Time to escalate, Level 1 to 2       Normal         168.3      93.0          0.524
Time to escalate, Level 2 to 3       Normal         194.3     108.89         0.900
The bank’s management has a set of specific target KPIs (Table 4-4). For
example, the mean waiting time is limited to 30 minutes compared with the current mean
waiting time of 67 minutes. This requirement aims to enhance customer satisfaction
levels.
Figure 4-1. Incident Arrivals - Histogram.
(Histogram of the time between incident arrivals with an exponential fit; mean = 13.54 min, N = 250.)
Figure 4-2. Incident Arrivals – Probability Plot.
Figure 4-3. Resolution Times - Histograms.
(Overlaid histograms of the resolution times for each support level with normal fits; Level 1: mean 57.94, StDev 76.97, N = 125; Level 2: mean 70.37, StDev 101.4, N = 100; Level 3: mean 205.9, StDev 100.7, N = 25.)
Figure 4-4. Resolution Times – Probability Plots.
Figure 4-5. Times to Escalate - Histograms.
(Overlaid histograms of the times to escalate with normal fits; Level 1 to Level 2: mean 168.3, StDev 93.00, N = 78; Level 2 to Level 3: mean 194.3, StDev 108.0, N = 89.)
Figure 4-6. Times to Escalate – Probability Plots.
Furthermore, the mean downtime should be 4 hours instead of the current mean
downtime of 7 hours to ensure compliance with the desired operational risk tolerance of
the bank. Additionally, it is required that agents escalate to the immediately higher
support level if they spend more than 120 minutes on a given incident at Level 1 Support
and more than 240 minutes at Level 2 Support. The purpose of introducing this last
requirement is to ensure that an efficient and consistent escalation procedure exists and is
followed.
Table 4-4. Specific Target KPIs.
KPIs Target
KPI 1: Mean waiting time (min) 30
KPI 2: Mean downtime (min) 240
KPI 3a: Mean time to escalate to Level 2 Support (min) 120
KPI 3b: Mean time to escalate to Level 3 Support (min) 240
A Monte Carlo simulation with 100,000 iterations is performed to capture the
proposed MBSE IM service implementation results, as shown in Table 4-5. An average
of 270 incidents is randomly generated. The results are analyzed in two stages. The first
stage is performed to statistically test whether the mean waiting time and downtime in the
proposed MBSE approach are significantly less than those in the bank’s implementation.
At a confidence level of 95% with a left-tailed t-test, the research hypotheses for this
stage are as follows:
Waiting time (left-tailed t-test)
Ho: MBSE’s mean waiting time is greater than or equal to the
measurements’ mean waiting time
Ha: MBSE’s mean waiting time is less than the measurements’
mean waiting time
Downtime (left-tailed t-test)
Ho: MBSE’s mean downtime is greater than or equal to the
measurements’ mean downtime
Ha: MBSE’s mean downtime is less than the measurements’ mean
downtime.
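This first-stage comparison can be sketched as a two-sample (Welch) t-test in SciPy. The samples below are synthetic stand-ins drawn to match the summary statistics later reported in Table 4-5; the sample size of 250 per group is an assumption of this sketch.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-ins matching the reported summaries (mean, std dev).
measured = rng.normal(67.58, 35.82, size=250)   # bank's measured waiting times
mbse = rng.normal(28.85, 39.36, size=250)       # MBSE simulation waiting times

# Left-tailed Welch t-test:
#   Ho: mean(mbse) >= mean(measured);  Ha: mean(mbse) < mean(measured)
t_stat, p_value = stats.ttest_ind(mbse, measured, equal_var=False,
                                  alternative="less")
reject_null = p_value < 0.05
```

The downtime comparison uses the same call with the downtime samples substituted.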
Table 4-5. Implementation Improvement Results using the Proposed MBSE
Approach.

                       Measurements             MBSE results          T-value   Null hypothesis
                       Mean     Std. dev.       Mean     Std. dev.              (accept/reject)
Waiting time (min)     67.58      35.82         28.85      39.36      11.75     Reject
Downtime (min)        414.59     326.71        105.01      62.96      14.73     Reject
Rejecting both null hypotheses indicates that the mean waiting time and mean
downtime are reduced when the proposed MBSE approach is employed. Further
correlation analysis is conducted on both the downtime and waiting time residuals.
Figures 4-7 and 4-8 show the autocorrelation functions of the downtime and waiting time
residuals. Although there are slight indications of autocorrelation patterns, both
residual series demonstrate high degrees of lead-lag independence. Figure 4-9 depicts the
cross-correlation between the downtime and waiting time residuals. This figure shows
a clear correlation between the two residual series while consistently indicating their
lead-lag independence. The complete list of actual measurements
and MBSE model outputs is available in Appendix A.
Figure 4-7. Autocorrelation Function of Downtime Residuals.
Figure 4-8. Autocorrelation Function of Waiting Time Residuals.
Figure 4-9. Cross Correlation Function of Waiting Time and Downtime Residuals.
The second stage of statistical validation is conducted to verify that the
MBSE results adhere to the target KPI values. At a 95% confidence level, a set of
hypotheses is tested regarding whether the simulation generates values that are less
than or equal to the target KPIs for the waiting time and downtime and values that are
equal to the target times to escalate. Table 4-6 illustrates that the proposed MBSE IM
service implementation in the bank does adhere to the target KPI values. For this stage,
the following hypotheses are formulated:
Waiting time (left-tailed z-test)
Ho: MBSE’s mean waiting time is greater than the target mean
waiting time
Ha: MBSE’s mean waiting time is less than or equal to the
target mean waiting time
Downtime (left-tailed z-test)
Ho: MBSE’s mean downtime is greater than the target mean
downtime
Ha: MBSE’s mean downtime is less than or equal to the target
mean downtime
Time to escalate (two-tailed z-test)
Ho: MBSE’s mean time to escalate is different from the target
mean time to escalate
Ha: MBSE’s mean time to escalate is not different from the
target mean time to escalate.
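The second-stage test for KPI 1 can be reproduced from the summary statistics alone. Assuming a sample of n = 250 incidents (the bank's average daily volume, an assumption of this sketch), the left-tail probability below closely reproduces the 0.323 reported in Table 4-6:

```python
import math
from scipy import stats

# MBSE summary for KPI 1 and the assumed per-day sample size.
n = 250
mbse_mean, mbse_sd = 28.85, 39.36
target = 30.0

# One-sample z statistic for the simulated mean against the target KPI.
z = (mbse_mean - target) / (mbse_sd / math.sqrt(n))
p_left = stats.norm.cdf(z)      # left-tail probability P(Z <= z)
```

The two-tailed escalation tests use the same statistic with the tail probability doubled.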
Table 4-6. Adherence to the Target KPI Values using the Proposed MBSE.

                                             Target KPI   MBSE results          p-value   Null hypothesis
                                             value        Mean     Std. dev.              (accept/reject)
Waiting time (min)                               30        28.85     39.36      0.323     Reject
Downtime (min)                                  240       105.01     62.96      1.000     Reject
Time to escalate to Level 2 Support (min)       120       120         2.03      1.000     Reject
Time to escalate to Level 3 Support (min)       240       240         0.2       1.000     Reject
The above simulation experiment demonstrates how MBSE can be used to
improve the implementation of an ITIL service while integrating it with other relevant
services, from the requirement definition stage through the validation of each
requirement using Monte Carlo simulations. Using MBSE enables management to
effectively design IT services that fulfill the defined requirements and successfully
meet the target objectives. Depending on the scope of the ITIL implementation that an
organization intends to perform, MBSE can be employed to bring the rigor of SE to the
implementation initiative. This research attempted to demonstrate how this utility is
possible for ITIL services and how it can be scaled to cover a larger scope of
implementation.
Chapter 5—Conclusions, Challenges, and Recommendations for Future Research
5.1 Conclusions
MBSE has been widely used to provide a systematic method for designing and
modeling products and processes in different domains. However, little effort has been
made to explore how MBSE can benefit the IT management domain. This report
presented an approach that uses MBSE to model the implementation of ITIL. All five
ITIL modules were considered together in this approach, in contrast to previous
studies that examined only individual ITIL modules. The standard requirements of ITIL
were based on the ITIL v3 reference manuals, and the different artifacts were captured
using SysML. The ITIL implementation initiative is managed as an SE project via MBSE
in which stakeholders are asked to provide their views while the model is being
developed with less reliance on documents and more reliance on the digital representation
of the entire implementation project.
With the proposed approach, stakeholders are able to conduct simulations and
observe the predicted behavior of the planned implementation and are able to adjust their
understanding of how the implementation should be carried out to fulfill the project
requirements and achieve the project objectives. Although this methodology can be
applied to create a model of the entire system, this study provided a specific example of
how MBSE can be used in the detailed design of an IM service. The proposed model is
validated via simulations based on available real-world measurements from a commercial
bank.
5.2 Challenges
Although our approach has produced very promising results, challenges remain
in this area of research. The sparsity of the literature on the use of MBSE to
represent ITIL implementations allows only limited comparison with prior
research efforts. However, this new study will support future research initiatives that
compare and contrast MBSE-based ITIL implementation with non-MBSE-based
approaches. In addition, although it has been well-established that MBSE enables
organizations to have better communication among their teams when building models for
their systems of interest, this research could benefit from further surveys that measure the
various benefits that MBSE brings to organizations that aim to or are in the process of
implementing ITIL in their workplace. Furthermore, organizations that plan to employ
MBSE need to develop some experience on how to model IT services using MBSE
artifacts.
5.3 Recommendations for Future Research
This research recommends that organizations adopt the proposed MBSE approach
when deciding to implement the ITIL framework and fully capture the service
processes in the model itself. In this way, an organization will gain far more insight into
how the ITIL framework should be implemented while addressing existing challenges,
especially those related to decision-making, risk management, and quality improvement.
We also recommend that organizations integrate the proposed MBSE approach with
mature frameworks to arrive at an ITIL implementation that continuously improves with
better value realization and enhanced returns on investments.
References
Aier, S., Bucher, T., & Winter, R. (2011). Critical success factors of service orientation in
information systems engineering. Business & Information Systems Engineering,
3(2), 77-88.
Ahlemann, F. (2009). Towards a conceptual reference model for project management
information systems. International Journal of Project Management, 27(1), 19-30.
Ahmad, N., & Shamsudin, Z. M. (2013). Systematic approach to successful
implementation of ITIL. Procedia computer science, 17, 237-244.
AlShamy, M. M., Elfakharany, E., & ElAziem, M. A. (2012). Information technology
service management (ITSM) implementation methodology based on information
technology infrastructure library ver. 3 (ITIL V3). International Journal of
Business Research and Management, 3(3), 113-132.
Axelos. (2011). ITIL v3: IT Infrastructure Library.
Axelos (2017). IT service management benchmarking report.
Bartolini, C., Stefanelli, C., & Tortonesi, M. (2008, September). SYMIAN: A simulation
tool for the optimization of the IT incident management process. In International
Workshop on Distributed Systems: Operations and Management (pp. 83-94).
Springer, Berlin, Heidelberg.
Bernard, P. (2014). IT service management based on ITIL® 2011 edition. Zaltbommel:
Van Haren.
Bjorkman, E. A., Sarkani, S., & Mazzuchi, T. A. (2013). Using model‐based systems
engineering as a framework for improving test and evaluation activities. Systems
Engineering, 16(3), 346-362.
Carley, K. (1994). Sociology: Computational organization theory. Social Science
Computer Review, 12(4), 611-624.
Cho, S. H., & Eppinger, S. D. (2005). A simulation-based process model for managing
complex design projects. IEEE Transactions on engineering management, 52(3),
316-328.
Crisp, H. (2007). Systems engineering vision 2020. Seattle, WA: INCOSE.
Cronholm, S., & Persson, L. (2012). Best Practice in IT Service Management:
Experienced Strengths and Weaknesses of Using ITIL. In ICMLG2016-4th
International Conference on Management, Leadership and Governance:
ICMLG2016 (p. 60). Academic Conferences and publishing limited.
Diirr, T., & Santos, G. (2014). Improvement of IT service processes: a study of critical
success factors. Journal of Software Engineering Research and Development,
2(1), 4.
Eikebrokk, T. R., & Iden, J. (2016). Enabling a culture for IT services; the role of the IT
infrastructure library. International Journal of Information Technology and
Management, 15(1), 14-40.
Ebner, K., Mueller, B., Urbach, N., Riempp, G., & Krcmar, H. (2016). Assessing IT
Management's Performance: A Design Theory for Strategic IT Benchmarking.
IEEE Transactions on Engineering Management, 63(1), 113-126.
Farr, J. V., & Buede, D. M. (2003). Systems engineering and engineering management:
Keys to the efficient development of products and services. Engineering
Management Journal, 15(3), 3-9.
Gelbard, R., Pliskin, N., & Spiegler, I. (2002). Integrating system analysis and project
management tools. International Journal of Project Management, 20(6), 461-468.
Gacenga, F., Cater-Steel, A., & Toleman, M. (2010). An international analysis of IT
service management benefits and performance measurement. Journal of Global
Information Technology Management, 13(4), 28-63.
Iden, J., & Eikebrokk, T. R. (2013). Implementing IT Service Management: A systematic
literature review. International Journal of Information Management, 33(3), 512-
523.
Iden, J., & Eikebrokk, T. R. (2014a). Exploring the relationship between information
technology infrastructure library and process management: theory development
and empirical testing. Knowledge and Process Management, 21(4), 292-306.
Iden, J., & Eikebrokk, T. R. (2014b). Using the ITIL process reference model for
realizing IT governance: An empirical investigation. Information Systems
Management, 31(1), 37-58.
International Council on Systems Engineering (INCOSE) SE Handbook Working Group.
(2011). Systems engineering handbook: A guide for system life cycle processes
and activities. San Diego, CA, USA, 1-386.
Izukura, S., Yanoo, K., Osaki, T., Sakaki, H., Kimura, D., & Xiang, J. (2011). Applying
a model-based approach to IT systems development using SysML extension. In
International Conference on Model Driven Engineering Languages and Systems
(pp. 563-577). Springer, Berlin, Heidelberg.
Izukura, S., Yanoo, K., Sakaki, H., & Kawatsu, M. (2013, July). Determining appropriate
IT systems design based on system models. In Computer Software and
Applications Conference (COMPSAC), 2013 IEEE 37th Annual (pp. 834-835).
IEEE.
Jelliti, M., Sibilla, M., Jamoussi, Y., & Ghezala, H. B. (2010). A model based framework
supporting ITIL service IT management. In Enterprise, Business-Process and
Information Systems Modeling (pp. 208-219). Springer, Berlin, Heidelberg.
Lema, L., Calvo‐Manzano, J. A., Colomo‐Palacios, R., & Arcilla, M. (2015). ITIL in
small to medium‐sized enterprises software companies: towards an
implementation sequence. Journal of Software: Evolution and Process, 27(8),
528-538.
Lima, A., Sauve, J., & Souza, N. (2012). Capturing the quality and business value of IT
services using a business-driven model. IEEE Transactions on Network and
Service Management, 9(4), 421-432.
Lima, A. S., de Souza, J. N., Moura, J. A. B., & da Silva, I. P. (2018). A Consensus-
Based Multicriteria Group Decision Model for Information Technology
Management Committees. IEEE Transactions on Engineering Management.
Locatelli, G., Mancini, M., & Romano, E. (2014). Systems Engineering to improve the
governance in complex project environments. International Journal of Project
Management, 32(8), 1395-1410.
Manoel, L. G., Bouzada, M. A. C., Alencar, A. J., da Silveira Ramos, A. A., & do
Fundao, C. U. I. (2017). Computer Simulation Improving the IT Helpdesk
Problem Management: A Systematic Literature Review. International Business
Management, 11(1), 68-77.
Marrone, M., & Kolbe, L. (2011). Impact of IT service management frameworks on the
IT organization. Business & Information Systems Engineering, 3, 5-18.
Mikaelian, T., Nightingale, D. J., Rhodes, D. H., & Hastings, D. E. (2011). Real options
in enterprise architecture: a holistic mapping of mechanisms and types for
uncertainty management. IEEE Transactions on Engineering Management, 58(3),
457-470.
Motamedian, B. (2013). MBSE applicability analysis. International Journal of Scientific
and Engineering Research, 4(2), 7.
Müller, S. D., & de Lichtenberg, C. G. (2018). The culture of ITIL: Values and
implementation challenges. Information Systems Management, 35(1), 49-61.
Nicho, M., & Mourad, B. A. (2012). Success factors for integrated ITIL deployment: An
IT governance classification. Journal of Information Technology Case and
Application Research, 14(1), 25-54.
Nikolaidou, M., Kapos, G. D., Tsadimas, A., Dalakas, V., & Anagnostopoulos, D. (2015,
May). Simulating SysML models: Overview and challenges. In System of
Systems Engineering Conference (SoSE), 2015 10th (pp. 328-333). IEEE.
Nikolaidou, M., Kapos, G. D., Tsadimas, A., Dalakas, V., & Anagnostopoulos, D.
(2016). Challenges in SysML Model Simulation. Advances in Computer Science:
an International Journal, 5(4), 49-56.
Orta, E., & Ruiz, M. (2014). A simulation approach to decision making in IT service
strategy. The Scientific World Journal, 2014.
Orta, E., Ruiz, M., Hurtado, N., & Gawn, D. (2014). Decision-making in IT service
management: A simulation based approach. Decision Support Systems, 66, 36-51.
Overhage, S., Skroch, O., & Turowski, K. (2010). A Method to Evaluate the Suitability
of Requirements Specifications for Offshore Projects. Business & Information
Systems Engineering, 2(3), 155-164.
Pereira, R. F., & Silva, M. M. (2010). A maturity model for implementing ITIL v3. In
Proceedings of the 2010 6th World Congress on Services (pp. 399-406).
Washington, DC: IEEE Computer Society.
Pillai, A. K. R., Pundir, A. K., & Ganapathy, L. (2014). Improving information
technology infrastructure library service delivery using an integrated lean six
sigma framework: A case study in a software application support scenario.
Journal of Software Engineering and Applications, 7(06), 483.
Ramos, A. L., Ferreira, J. V., & Barceló, J. (2012). Model-based systems engineering: An
emerging approach for modern systems. IEEE Transactions on Systems, Man, and
Cybernetics, Part C (Applications and Reviews), 42(1), 101-111.
Romero, H. L., Dijkman, R. M., Grefen, P. W., & van Weele, A. J. (2015). Factors that
determine the extent of business process standardization and the subsequent effect
on business performance. Business & Information Systems Engineering, 57(4),
261-270.
Sebaaoui, S., & Lamrini, M. (2012). Implementation of ITIL in a Moroccan company:
the case of incident management process. International Journal of Computer
Science, 9(3-4), 30-36.
Sharon, A., de Weck, O. L., & Dori, D. (2013). Improving project–product lifecycle
management with model–based design structure matrix: a joint project
management and systems engineering approach. Systems Engineering, 16(4),
413-426.
Silva, A., Varajão, J., Pereira, J. L., & Pinto, C. S. (2017). Performance Appraisal
Approaches and Methods for IT/IS Projects: A Review. International Journal of
Human Capital and Information Technology Professionals (IJHCITP), 8(3), 15-
28.
Tsadimas, A., Kapos, G. D., Dalakas, V., Nikolaidou, M., & Anagnostopoulos, D.
(2016). Simulating simulation-agnostic SysML models for enterprise information
systems via DEVS. Simulation Modelling Practice and Theory, 66, 243-259.
Walden, D. D., Roedler, G. J., Forsberg, K., Hamelin, R. D., & Shortell, T. M. (2015).
Systems engineering handbook: A guide for system life cycle processes and
activities. John Wiley & Sons.
Valverde, R., & Talla, M. (2014). DSS Based IT Service Support Process Reengineering
Using ITIL: A Case Study. In Engineering and Management of IT-based Service
Systems (pp. 35-65). Springer, Berlin, Heidelberg.
Zhu, J., & Mostafavi, A. (2017). Discovering complexity and emergent properties in
project systems: A new approach to understanding project performance.
International Journal of Project Management, 35(1), 1-12.
Appendix A: Results of Model Validation
A Monte Carlo simulation with 100,000 iterations is performed to capture the
proposed MBSE IM service implementation results, as shown in Table 4-5. An average
of 270 incidents is randomly generated. The results are analyzed to statistically test
whether the mean waiting time and downtime in the proposed MBSE approach are
significantly less than those in the bank’s implementation. At a confidence level of 95%
with a left-tailed t-test, the research hypotheses are as follows:
Waiting time (left-tailed t-test)
Ho: MBSE’s mean waiting time is greater than or equal to the
measurements’ mean waiting time
Ha: MBSE’s mean waiting time is less than the measurements’
mean waiting time
Downtime (left-tailed t-test)
Ho: MBSE’s mean downtime is greater than or equal to the
measurements’ mean downtime
Ha: MBSE’s mean downtime is less than the measurements’ mean
downtime.
Rejecting both null hypotheses indicates that the mean waiting time and mean
downtime are reduced when the proposed MBSE approach is employed. The complete
list of actual measurements and MBSE model outputs is available in this appendix.
Actual measurements, model outputs, and measurement-model errors (all durations in
minutes; Res = resolution time, Wait = waiting time, Down = downtime):

Incident ID | Logging time | Assignment time | Closure time | Actual Res / Wait / Down | Model Res / Wait / Down | Error Res / Wait / Down
IM0024814 | 9:26:00 AM | 11:00:00 AM | 11:02:00 AM | 2.00 / 94.00 / 96.00 | 48.95 / 0.00 / 48.95 | 46.95 / 14.00 / -2.95
IM0024961 | 9:22:00 AM | 9:23:00 AM | 9:25:00 AM | 2.00 / 1.00 / 3.00 | 48.00 / 0.00 / 48.00 | 46.00 / -79.00 / -95.00
IM0024979 | 10:42:00 AM | 11:05:00 AM | 11:07:00 AM | 2.00 / 23.00 / 25.00 | 37.22 / 0.00 / 37.22 | 35.22 / -57.00 / -62.22
IM0025050 | 10:45:00 AM | 10:46:00 AM | 2:03:00 PM | 197.00 / 1.00 / 198.00 | 76.94 / 0.00 / 76.94 | -120.06 / -79.00 / 71.06
IM0025331 | 10:43:00 AM | 1:14:00 PM | 1:58:00 PM | 44.00 / 151.00 / 195.00 | 107.92 / 0.00 / 107.92 | 63.92 / 71.00 / 37.08
IM0025333 | 10:55:00 AM | 11:48:00 AM | 12:47:00 PM | 59.00 / 53.00 / 112.00 | 47.61 / 0.00 / 47.61 | -11.39 / -27.00 / 14.39
IM0025579 | 8:37:00 AM | 8:41:00 AM | 8:57:00 AM | 16.00 / 4.00 / 20.00 | 39.05 / 0.00 / 39.05 | 23.05 / -76.00 / -69.05
IM0025735 | 1:47:00 PM | 1:55:00 PM | 2:03:00 PM | 8.00 / 8.00 / 16.00 | 131.07 / 0.00 / 131.07 | 123.07 / -72.00 / -165.07
IM0025906 | 9:27:00 AM | 10:32:00 AM | 4:19:00 PM | 347.00 / 65.00 / 412.00 | 114.05 / 0.00 / 114.05 | -232.95 / -15.00 / 247.95
IM0025985 | 9:35:00 AM | 10:26:00 AM | 10:45:00 AM | 19.00 / 51.00 / 70.00 | 45.16 / 0.00 / 45.16 | 26.16 / -29.00 / -25.16
IM0026002 | 10:20:00 AM | 11:30:00 AM | 11:31:00 AM | 1.00 / 70.00 / 71.00 | 46.95 / 0.00 / 46.95 | 45.95 / -10.00 / -25.95
IM0026003 | 10:24:00 AM | 1:45:00 PM | 1:47:00 PM | 2.00 / 201.00 / 203.00 | 32.07 / 0.00 / 32.07 | 30.07 / 121.00 / 120.93
IM0026079 | 11:57:00 AM | 1:03:00 PM | 1:58:00 PM | 55.00 / 66.00 / 121.00 | 99.34 / 0.00 / 99.34 | 44.34 / -14.00 / -28.34
IM0026213 | 2:04:00 PM | 2:44:00 PM | 2:54:00 PM | 10.00 / 40.00 / 50.00 | 90.63 / 0.00 / 90.63 | 80.63 / -40.00 / -90.63
IM0026222 | 2:29:00 PM | 3:37:00 PM | 3:49:00 PM | 12.00 / 68.00 / 80.00 | 46.25 / 0.00 / 46.25 | 34.25 / -12.00 / -16.25
IM0026275 | 3:17:00 PM | 3:47:00 PM | 4:09:00 PM | 22.00 / 30.00 / 52.00 | 182.41 / 0.00 / 182.41 | 160.41 / -50.00 / -180.41
IM0026453 | 10:28:00 AM | 10:43:00 AM | 2:37:00 PM | 234.00 / 15.00 / 249.00 | 33.66 / 0.00 / 33.66 | -200.34 / -65.00 / 165.34
IM0026455 | 10:36:00 AM | 1:36:00 PM | 1:40:00 PM | 4.00 / 180.00 / 184.00 | 34.67 / 0.00 / 34.67 | 30.67 / 100.00 / 99.33
IM0026568 | 2:11:00 PM | 2:14:00 PM | 2:16:00 PM | 2.00 / 3.00 / 5.00 | 84.55 / 0.00 / 84.55 | 82.55 / -77.00 / -129.55
IM0026749 | 9:54:00 AM | 10:34:00 AM | 1:09:00 PM | 155.00 / 40.00 / 195.00 | 226.48 / 0.00 / 226.48 | 71.48 / -40.00 / -81.48
IM0027087 | 10:24:00 AM | 10:44:00 AM | 10:53:00 AM | 9.00 / 20.00 / 29.00 | 247.39 / 0.00 / 247.39 | 238.39 / -60.00 / -268.39
IM0027119 | 12:59:00 PM | 1:00:00 PM | 1:44:00 PM | 44.00 / 1.00 / 45.00 | 98.18 / 0.00 / 98.18 | 54.18 / -79.00 / -103.18
IM0027233 | 11:59:00 AM | 2:26:00 PM | 4:10:00 PM | 104.00 / 147.00 / 251.00 | 47.45 / 0.00 / 47.45 | -56.55 / 67.00 / 153.55
IM0027488 | 10:25:02 AM | 10:27:09 AM | 10:29:46 AM | 2.62 / 2.12 / 4.73 | 293.94 / 0.00 / 293.94 | 291.32 / -77.88 / -339.20
IM0027502 | 9:02:31 AM | 10:01:17 AM | 10:02:37 AM | 1.33 / 58.77 / 60.10 | 45.89 / 0.00 / 45.89 | 44.56 / -21.23 / -35.79
IM0027560 | 11:06:57 AM | 11:45:24 AM | 1:33:47 PM | 108.38 / 38.45 / 146.83 | 77.30 / 0.00 / 77.30 | -31.08 / -41.55 / 19.53
IM0027565 | 11:31:49 AM | 11:46:27 AM | 1:39:22 PM | 112.92 / 14.63 / 127.55 | 107.09 / 0.00 / 107.09 | -5.83 / -65.37 / -29.54
IM0027784 | 8:56:12 AM | 10:53:27 AM | 10:55:45 AM | 2.30 / 117.25 / 119.55 | 72.92 / 0.00 / 72.92 | 70.62 / 37.25 / -3.37
IM0027822
9:45:08
AM
1:37:49
PM
1:47:26
PM 9.62 232.68 242.30 165.04 47.3 212.40 155.42 105.33 -20.10
IM0027825
9:47:22
AM
1:38:05
PM
1:45:00
PM 6.92 230.72 237.63 47.01 0.00 47.01 40.09 150.72 140.62
IM0027828
9:57:17
AM
1:38:23
PM
1:49:46
PM 11.38 221.10 232.48 57.47 0.00 57.47 46.08 141.10 125.02
IM0027956
10:54:1
8 AM
2:31:24
PM
2:50:45
PM 19.35 217.10 236.45 39.48 0.00 39.48 20.13 137.10 146.97
IM0027957
10:57:2
8 AM
1:38:55
PM
2:34:04
PM 55.15 161.45 216.60 38.43 0.00 38.43 -16.72 81.45 128.17
IM0028048
1:40:47
PM
3:06:08
PM
3:43:34
PM 37.43 85.35 122.78 97.13 0.00 97.13 59.70 5.35 -24.35
IM0028217
9:34:37
AM
10:41:32
AM
11:26:17
AM 44.75 66.92 111.67 36.61 0.00 36.61 -8.14 -13.08 25.05
IM0028413
3:34:13
PM
3:35:46
PM
4:13:21
PM 37.58 1.55 39.13 101.28 26.0 127.31 63.70
-
104.48
-
138.18
IM0028564
10:15:5
8 AM
4:39:27
PM
4:42:23
PM 2.93 383.48 386.42 60.99 0.00 60.99 58.05 303.48 275.43
87
IM0028566
10:35:1
5 AM
2:49:45
PM
2:56:39
PM 6.90 254.50 261.40 90.98 0.00 90.98 84.08 174.50 120.42
IM0028591
8:56:52
AM
2:50:46
PM
3:11:35
PM 20.82 353.90 374.72 30.07 0.00 30.07 9.26 273.90 294.64
IM0028709
12:49:0
2 PM
2:26:33
PM
2:30:16
PM 3.72 97.52 101.23 46.68 0.00 46.68 42.96 17.52 4.55
IM0028712
12:58:2
0 PM
12:59:36
PM
2:29:59
PM 90.38 1.27 91.65 193.38 4.50 197.88 102.99 -83.24
-
156.23
IM0028826
9:07:29
AM
10:24:34
AM
3:54:15
PM 329.68 77.08 406.77 32.21 0.00 32.21
-
297.47 -2.92 324.55
IM0028834
10:04:3
5 AM
10:25:58
AM
10:52:14
AM 26.27 21.38 47.65 41.91 0.00 41.91 15.64 -58.62 -44.26
IM0028877
12:31:2
4 PM
12:35:20
PM
3:20:33
PM 165.22 3.93 169.15 106.33 0.00 106.33 -58.88 -76.07 12.82
IM0028885
8:31:59
AM
11:39:11
AM
11:51:54
AM 12.72 187.20 199.92 47.18 0.00 47.18 34.46 107.20 102.74
IM0028898
9:32:07
AM
11:39:26
AM
11:56:54
AM 17.47 127.32 144.78 156.74 0.00 156.74 139.27 47.32 -61.95
IM0029068
4:09:10
PM
4:53:27
PM
4:55:39
PM 2.20 44.28 46.48 45.70 0.00 45.70 43.50 -35.72 -49.22
88
IM0029135
9:11:26
AM
9:12:00
AM
9:13:26
AM 1.43 0.57 2.00 41.77 0.00 41.77 40.34 -79.43 -89.77
IM0029148
9:58:05
AM
9:59:07
AM
1:11:22
PM 192.25 1.03 193.28 32.01 0.00 32.01
-
160.24 -78.97 111.27
IM0029281
11:03:4
8 AM
3:41:44
PM
3:41:55
PM 0.18 277.93 278.12 53.36 0.00 53.36 53.17 197.93 174.76
IM0029285
11:28:3
9 AM
1:13:43
PM
1:26:14
PM 12.52 105.07 117.58 105.86 0.00 105.86 93.34 25.07 -38.28
IM0029287
11:42:1
3 AM
12:38:29
PM
3:13:40
PM 155.18 56.27 211.45 47.06 0.00 47.06
-
108.13 -23.73 114.39
IM0029371
2:18:35
PM
2:19:54
PM
2:21:54
PM 2.00 1.32 3.32 96.87 0.00 96.87 94.87 -78.68
-
143.55
IM0029523
9:08:12
AM
11:02:38
AM
11:04:10
AM 1.53 114.43 115.97 47.67 0.00 47.67 46.13 34.43 18.30
IM0029542
10:03:4
7 AM
10:44:07
AM
2:10:10
PM 206.05 40.33 246.38 48.50 0.00 48.50
-
157.55 -39.67 147.88
IM0029710
10:59:2
6 AM
11:10:38
AM
11:43:33
AM 32.92 11.20 44.12 37.19 0.00 37.19 4.28 -68.80 -43.08
IM0029786
11:49:5
3 AM
1:18:52
PM
2:01:09
PM 42.28 88.98 131.27 53.73 0.00 53.73 11.45 8.98 27.53
89
IM0029964
8:01:36
AM
8:27:40
AM
10:43:17
AM 135.62 26.07 161.68 229.36 0.00 229.36 93.74 -53.93
-
117.67
IM0030104
9:43:34
AM
12:25:01
PM
1:31:27
PM 66.43 161.45 227.88 52.14 0.00 52.14 -14.30 81.45 125.75
IM0030106
9:46:33
AM
9:49:55
AM
4:18:01
PM 388.10 3.37 391.47 93.22 0.00 93.22
-
294.88 -76.63 248.25
IM0030115
11:06:0
8 AM
11:10:00
AM
11:15:13
AM 5.22 3.87 9.08 61.32 0.00 61.32 56.10 -76.13
-
102.23
IM0030121
12:00:4
7 PM
12:02:00
PM
2:56:42
PM 174.70 1.22 175.92 237.24 0.00 237.24 62.54 -78.78
-
111.33
IM0030388
8:01:26
AM
8:05:12
AM
1:39:22
PM 334.17 3.77 337.93 54.32 0.00 54.32
-
279.84 -76.23 233.61
IM0030534
9:57:23
AM
9:59:16
AM
9:59:39
AM 0.38 1.88 2.27 93.88 0.00 93.88 93.50 -78.12
-
141.62
IM0030538
10:07:5
2 AM
10:09:20
AM
2:04:45
PM 235.42 1.47 236.88 31.67 0.00 31.67
-
203.75 -78.53 155.21
IM0030605
2:23:44
PM
2:25:20
PM
2:27:24
PM 2.07 1.60 3.67 39.69 0.00 39.69 37.63 -78.40 -86.03
IM0030928
11:09:2
9 AM
11:11:56
AM
11:25:12
AM 13.27 2.45 15.72 58.01 0.00 58.01 44.74 -77.55 -92.29
90
IM0031039
2:25:04
PM
2:48:49
PM
3:20:00
PM 31.18 23.75 54.93 55.58 0.00 55.58 24.40 -56.25 -50.65
IM0031536
3:31:34
PM
4:24:56
PM
4:38:14
PM 13.30 53.37 66.67 101.21 0.00 101.21 87.91 -26.63 -84.54
IM0031614
9:20:35
AM
3:10:13
PM
5:05:04
PM 114.85 349.63 464.48 35.02 0.00 35.02 -79.83 269.63 379.47
IM0031615
9:21:34
AM
10:17:52
AM
10:18:26
AM 0.57 56.30 56.87 50.39 0.00 50.39 49.82 -23.70 -43.52
IM0031618
9:38:02
AM
10:41:26
AM
3:01:28
PM 260.03 63.40 323.43 32.39 0.00 32.39
-
227.65 -16.60 241.05
IM0031646
9:21:54
AM
10:29:06
AM
10:35:26
AM 6.33 67.20 73.53 46.83 0.00 46.83 40.50 -12.80 -23.30
IM0032040
9:16:28
AM
9:21:44
AM
11:34:01
AM 132.28 5.27 137.55 87.54 0.00 87.54 -44.74 -74.73 0.01
IM0032044
9:24:53
AM
11:28:52
AM
12:09:29
PM 40.62 123.98 164.60 49.95 0.00 49.95 9.34 43.98 64.65
IM0032046
9:32:08
AM
9:43:17
AM
9:50:11
AM 6.90 11.15 18.05 43.02 0.00 43.02 36.12 -68.85 -74.97
IM0032157
10:52:1
1 AM
10:53:00
AM
10:54:42
AM 1.70 0.82 2.52 30.92 0.00 30.92 29.22 -79.18 -78.40
91
IM0032281
11:36:1
1 AM
12:07:25
PM
12:10:48
PM 3.38 31.23 34.62 166.18 0.00 166.18 162.79 -48.77
-
181.56
IM0032419
8:35:09
AM
9:43:04
AM
10:48:30
AM 65.43 67.92 133.35 172.10 0.00 172.10 106.66 -12.08 -88.75
IM0032465
9:06:02
AM
2:42:49
PM
3:22:00
PM 39.18 336.78 375.97 60.62 0.00 60.62 21.44 256.78 265.34
IM0032548
10:27:0
1 AM
3:15:37
PM
3:19:24
PM 3.78 288.60 292.38 202.45 0.00 202.45 198.67 208.60 39.93
IM0032550
10:43:4
7 AM
2:16:14
PM
2:49:58
PM 33.73 212.45 246.18 78.10 0.00 78.10 44.36 132.45 118.09
IM0032610
1:32:39
PM
2:30:32
PM
3:50:25
PM 79.88 57.88 137.77 54.78 0.00 54.78 -25.11 -22.12 32.99
IM0032634
11:54:0
3 AM
11:59:19
AM
11:59:36
AM 0.28 5.27 5.55 57.48 0.00 57.48 57.19 -74.73
-
101.93
IM0032749
2:21:04
PM
3:16:06
PM
3:18:08
PM 2.03 55.03 57.07 38.02 0.00 38.02 35.99 -24.97 -30.95
IM0032927
9:11:20
AM
9:32:35
AM
11:54:09
AM 141.57 21.25 162.82 131.04 0.00 131.04 -10.52 -58.75 -18.23
IM0032930
9:31:36
AM
10:26:07
AM
11:57:14
AM 91.12 54.52 145.63 82.19 0.00 82.19 -8.92 -25.48 13.44
92
IM0032932
9:36:01
AM
10:26:32
AM
11:58:56
AM 92.40 50.52 142.92 234.20 71.4 305.67 141.80
-
100.96
-
212.76
IM0033121
3:13:51
PM
3:14:10
PM
3:38:02
PM 23.87 0.32 24.18 51.52 0.00 51.52 27.66 -79.68 -77.34
IM0033214
10:02:0
0 AM
1:50:00
PM
2:06:00
PM 16.00 228.00 244.00 209.01 40.7 249.74 193.01 107.27 -55.74
IM0033215
10:03:0
0 AM
10:40:00
AM
10:42:00
AM 2.00 37.00 39.00 41.28 0.00 41.28 39.28 -43.00 -52.28
IM0033291
9:43:00
AM
11:55:00
AM
1:50:00
PM 115.00 132.00 247.00 48.92 0.00 48.92 -66.08 52.00 148.08
IM0033301
10:26:0
0 AM
12:41:00
PM
1:53:00
PM 72.00 135.00 207.00 184.15 40.1 224.32 112.15 14.82 -67.32
IM0033323
9:42:00
AM
11:55:00
AM
1:07:00
PM 72.00 133.00 205.00 49.51 0.00 49.51 -22.49 53.00 105.49
IM0033371
10:21:0
0 AM
1:36:00
PM
1:44:00
PM 8.00 195.00 203.00 56.14 0.00 56.14 48.14 115.00 96.86
IM0033403
11:53:0
0 AM
2:33:00
PM
3:02:00
PM 29.00 160.00 189.00 39.93 0.00 39.93 10.93 80.00 99.07
IM0033500
1:01:00
PM
1:05:00
PM
3:36:00
PM 151.00 4.00 155.00 193.81 36.4 230.30 42.81
-
112.49
-
125.30
93
IM0033519
1:57:00
PM
2:48:00
PM
4:44:00
PM 116.00 51.00 167.00 30.86 0.00 30.86 -85.14 -29.00 86.14
IM0033700
8:25:00
AM
9:57:00
AM
1:32:00
PM 215.00 92.00 307.00 36.63 0.00 36.63
-
178.37 12.00 220.37
IM0033702
8:28:00
AM
8:41:00
AM
4:01:00
PM 440.00 13.00 453.00 53.57 0.00 53.57
-
386.43 -67.00 349.43
IM0033803
10:44:0
0 AM
1:09:00
PM
4:00:00
PM 171.00 145.00 316.00 77.45 0.00 77.45 -93.55 65.00 188.55
IM0033908
2:17:00
PM
2:39:00
PM
2:43:00
PM 4.00 22.00 26.00 73.72 40.0 113.81 69.72 -98.09
-
137.81
IM0033949
12:28:0
0 PM
2:03:00
PM
2:08:00
PM 5.00 95.00 100.00 50.97 0.00 50.97 45.97 15.00 -0.97
IM0033965
1:53:00
PM
2:38:00
PM
4:00:00
PM 82.00 45.00 127.00 213.20 72.7 285.98 131.20
-
107.78
-
208.98
IM0033989
1:59:00
PM
3:07:00
PM
3:56:00
PM 49.00 68.00 117.00 49.43 0.00 49.43 0.43 -12.00 17.57
IM0034186
10:08:0
0 AM
11:55:00
AM
12:47:00
PM 52.00 107.00 159.00 271.82
126.
6 398.44 219.82 -99.62
-
289.44
IM0034272
10:44:0
0 AM
1:35:00
PM
1:43:00
PM 8.00 171.00 179.00 34.92 0.00 34.92 26.92 91.00 94.08
94
IM0034340
12:21:0
0 PM
2:00:00
PM
2:42:00
PM 42.00 99.00 141.00 50.00 0.00 50.00 8.00 19.00 41.00
IM0034354
1:42:00
PM
1:43:00
PM
2:05:00
PM 22.00 1.00 23.00 151.97 86.7 238.74 129.97
-
165.77
-
265.74
IM0034411
1:38:00
PM
1:40:00
PM
3:43:00
PM 123.00 2.00 125.00 50.80 0.00 50.80 -72.20 -78.00 24.20
IM0034649
10:05:0
0 AM
11:53:00
AM
1:38:00
PM 105.00 108.00 213.00 39.13 0.00 39.13 -65.87 28.00 123.87
IM0034655
10:22:0
0 AM
10:23:00
AM
4:32:00
PM 369.00 1.00 370.00 49.49 0.00 49.49
-
319.51 -79.00 270.51
IM0034805
1:26:00
PM
3:07:00
PM
3:09:00
PM 2.00 101.00 103.00 146.29 59.5 205.80 144.29 -38.51
-
152.80
IM0034848
1:34:00
PM
2:56:00
PM
3:05:00
PM 9.00 82.00 91.00 57.39 0.00 57.39 48.39 2.00 -16.39
IM0035014
9:14:00
AM
11:56:00
AM
11:59:00
AM 3.00 162.00 165.00 41.66 0.00 41.66 38.66 82.00 73.34
IM0035093
11:24:0
0 AM
12:59:00
PM
3:51:00
PM 172.00 95.00 267.00 100.68 0.00 100.68 -71.32 15.00 116.32
IM0035171
12:56:0
0 PM
1:44:00
PM
1:46:00
PM 2.00 48.00 50.00 52.58 0.00 52.58 50.58 -32.00 -52.58
95
IM0035334
8:12:00
AM
8:12:00
AM
2:17:00
PM 365.00 0.00 365.00 107.85 0.00 107.85
-
257.15 -80.00 207.15
IM0035398
8:23:00
AM
8:30:00
AM
10:06:00
AM 96.00 7.00 103.00 50.35 0.00 50.35 -45.65 -73.00 2.65
IM0035423
11:05:0
0 AM
11:32:00
AM
11:56:00
AM 24.00 27.00 51.00 107.46 0.00 107.46 83.46 -53.00
-
106.46
IM0035544
11:23:0
0 AM
11:27:00
AM
2:31:00
PM 184.00 4.00 188.00 51.79 0.00 51.79
-
132.21 -76.00 86.21
IM0035786
9:32:00
AM
9:33:00
AM
9:51:00
AM 18.00 1.00 19.00 35.26 0.00 35.26 17.26 -79.00 -66.26
IM0035854
10:30:0
0 AM
11:56:00
AM
2:12:00
PM 136.00 86.00 222.00 66.19 0.00 66.19 -69.81 6.00 105.81
IM0035898
11:16:0
0 AM
11:56:00
AM
2:00:00
PM 124.00 40.00 164.00 109.05 10.7 119.76 -14.95 -50.70 -5.76
IM0035901
11:26:0
0 AM
11:56:00
AM
1:59:00
PM 123.00 30.00 153.00 118.81 36.3 155.17 -4.19 -86.37 -52.17
IM0035902
11:39:0
0 AM
2:33:00
PM
2:35:00
PM 2.00 174.00 176.00 105.10 42.1 147.27 103.10 51.83 -21.27
IM0035917
11:19:0
0 AM
11:32:48
AM
12:31:00
PM 58.20 13.80 72.00 57.29 0.00 57.29 -0.91 -66.20 -35.29
96
IM0036280
2:07:00
PM
2:53:00
PM
2:56:00
PM 3.00 46.00 49.00 48.26 0.00 48.26 45.26 -34.00 -49.26
IM0036302
1:00:00
PM
1:02:00
PM
3:16:00
PM 134.00 2.00 136.00 170.21 1.11 171.32 36.21 -79.11 -85.32
IM0036835
11:34:0
6 AM
1:15:11
PM
2:28:04
PM 72.88 101.08 173.97 71.02 4.29 75.31 -1.86 16.79 48.66
IM0037175
10:12:0
9 AM
10:13:10
AM
1:58:58
PM 225.80 1.02 226.82 95.98 53.4 149.41
-
129.82
-
132.41 27.41
IM0037362
2:11:11
PM
2:13:14
PM
3:22:44
PM 69.50 2.05 71.55 53.44 0.00 53.44 -16.06 -77.95 -31.89
IM0037558
8:27:52
AM
10:40:15
AM
11:51:12
AM 70.95 132.38 203.33 54.54 0.00 54.54 -16.41 52.38 98.79
IM0037628
10:46:5
6 AM
12:22:20
PM
2:32:52
PM 130.53 95.40 225.93 36.69 0.00 36.69 -93.84 15.40 139.24
IM0037629
10:48:2
9 AM
10:50:29
AM
10:56:29
AM 6.00 2.00 8.00 51.99 0.00 51.99 45.99 -78.00 -93.99
IM0037700
10:47:3
9 AM
3:29:29
PM
3:34:12
PM 4.72 281.83 286.55 222.61 0.00 222.61 217.90 201.83 13.94
IM0037910
7:39:35
AM
7:42:55
AM
5:17:09
PM 574.23 3.33 577.57 59.16 0.00 59.16
-
515.07 -76.67 468.40
97
IM0037911
8:09:58
AM
10:09:08
AM
11:29:02
AM 79.90 119.17 199.07 42.35 0.00 42.35 -37.55 39.17 106.72
IM0037961
10:42:2
7 AM
10:52:55
AM
2:19:06
PM 206.18 10.47 216.65 31.54 0.00 31.54
-
174.65 -69.53 135.11
IM0038269
7:37:43
AM
1:31:26
PM
2:02:49
PM 31.38 353.72 385.10 39.46 0.00 39.46 8.07 273.72 295.64
IM0038476
1:36:12
PM
4:24:47
PM
4:36:00
PM 11.22 168.58 179.80 39.44 0.00 39.44 28.23 88.58 90.36
IM0038676
10:28:0
9 AM
2:22:14
PM
2:22:26
PM 0.20 234.08 234.28 178.26 0.00 178.26 178.06 154.08 6.03
IM0038874
2:50:33
PM
3:59:26
PM
4:21:59
PM 22.55 68.88 91.43 31.67 0.00 31.67 9.12 -11.12 9.76
IM0038993
8:23:34
AM
11:57:56
AM
4:51:51
PM 293.92 214.37 508.28 65.70 0.00 65.70
-
228.22 134.37 392.58
IM0038997
8:59:30
AM
9:22:42
AM
12:05:45
PM 163.05 23.20 186.25 37.17 0.00 37.17
-
125.88 -56.80 99.08
IM0039018
10:58:2
9 AM
1:50:26
PM
3:59:15
PM 128.82 171.95 300.77 53.47 0.00 53.47 -75.34 91.95 197.29
IM0039317
8:50:38
AM
9:13:01
AM
9:29:22
AM 16.35 22.38 38.73 56.06 0.00 56.06 39.71 -57.62 -67.33
98
IM0039425
10:25:0
7 AM
10:26:12
AM
10:43:37
AM 17.42 1.08 18.50 48.72 0.00 48.72 31.30 -78.92 -80.22
IM0039432
10:46:1
8 AM
12:29:36
PM
1:45:59
PM 76.38 103.30 179.68 79.95 0.00 79.95 3.57 23.30 49.73
IM0039473
10:53:2
4 AM
1:31:23
PM
1:33:56
PM 2.55 157.98 160.53 125.10 0.00 125.10 122.55 77.98 -14.57
IM0039479
11:12:3
5 AM
1:31:47
PM
2:17:31
PM 45.73 139.20 184.93 61.92 0.00 61.92 16.19 59.20 73.01
IM0039496
12:31:2
2 PM
3:12:17
PM
3:21:43
PM 9.43 160.92 170.35 56.08 0.00 56.08 46.65 80.92 64.27
IM0039737
9:42:30
AM
9:43:45
AM
2:24:45
PM 281.00 1.25 282.25 37.43 0.00 37.43
-
243.57 -78.75 194.82
IM0039740
9:57:56
AM
9:59:39
AM
2:33:06
PM 273.45 1.72 275.17 296.50 0.00 296.50 23.05 -78.28 -71.33
IM0039752
9:56:11
AM
10:50:42
AM
12:09:01
PM 78.32 54.52 132.83 67.66 0.00 67.66 -10.66 -25.48 15.18
IM0039801
11:16:2
9 AM
1:29:59
PM
1:41:38
PM 11.65 133.50 145.15 187.91 0.00 187.91 176.26 53.50 -92.76
IM0039845
1:01:42
PM
1:02:46
PM
2:22:57
PM 80.18 1.07 81.25 41.36 0.00 41.36 -38.83 -78.93 -10.11
99
IM0039851
1:33:24
PM
2:20:57
PM
2:21:06
PM 0.15 47.55 47.70 53.51 0.00 53.51 53.36 -32.45 -55.81
IM0040061
2:35:38
PM
3:02:34
PM
4:58:38
PM 116.07 26.93 143.00 57.42 0.00 57.42 -58.64 -53.07 35.58
IM0040140
10:41:3
9 AM
10:42:58
AM
12:13:15
PM 90.28 1.32 91.60 40.83 0.00 40.83 -49.45 -78.68 0.77
IM0040269
3:20:26
PM
3:39:44
PM
3:55:58
PM 16.23 19.30 35.53 278.65 2.38 281.03 262.42 -63.08
-
295.49
IM0040366
10:16:1
4 AM
10:30:14
AM
11:53:03
AM 82.82 14.00 96.82 33.57 0.00 33.57 -49.24 -66.00 13.24
IM0040415
8:10:38
AM
8:11:12
AM
11:47:04
AM 215.87 0.57 216.43 62.09 0.00 62.09
-
153.78 -79.43 104.34
IM0040571
3:01:38
PM
4:06:41
PM
4:15:44
PM 9.05 65.05 74.10 114.27 0.00 114.27 105.22 -14.95 -90.17
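The actual-measurement columns above follow directly from the three logged timestamps: waiting time is the interval from logging to assignment, resolution time is the interval from assignment to closure, and downtime is their sum. A minimal sketch, checked against incident IM0024814 from the table (the helper name `minutes_between` is illustrative, not from the praxis):

```python
# Sketch: deriving the actual-measurement columns from the logged timestamps.
# Waiting time = assignment - logging; resolution time = closure - assignment;
# downtime = waiting + resolution (all in minutes).
from datetime import datetime

def minutes_between(start: str, end: str) -> float:
    """Minutes elapsed between two same-day clock times like '9:26:00 AM'."""
    fmt = "%I:%M:%S %p"
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    return round(delta.total_seconds() / 60.0, 2)

# Incident IM0024814: logged 9:26:00 AM, assigned 11:00:00 AM, closed 11:02:00 AM.
waiting = minutes_between("9:26:00 AM", "11:00:00 AM")     # table value: 94.00
resolution = minutes_between("11:00:00 AM", "11:02:00 AM")  # table value: 2.00
downtime = round(waiting + resolution, 2)                   # table value: 96.00
print(waiting, resolution, downtime)
```

The same arithmetic reproduces the seconds-resolution rows as well, e.g. incident IM0027488's waiting time of 2.12 minutes (10:25:02 AM to 10:27:09 AM).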