
White paper

Building a Performance System for IT

Table of contents

Executive summary
Key requirements for an IT performance system
The anatomy of a performance system
  Governance
  Delivery
  Measurement
Building a performance-oriented culture
The HP IT Performance Suite
The HP IT Executive Scorecard: a system of measures
Establishing a common data model
Building and using your system of measures
Key takeaways
Suggested next steps
Appendix: key performance indicators
  Customers
  IT value
  Future orientation
  Operational excellence

Executive summary

Every leader in today’s modern enterprise should have a performance system—a means to transparently measure and manage progress against the goals they are given. With $5 trillion per year invested in enterprise IT, it is essential that each IT leader has a digitized system that enables them to manage and improve the key performance indicators (KPIs) that matter to the organization.

This white paper documents the requirements of an effective IT performance system and introduces a set of associated measures and best practices that are both aspirational and practical. In addition, the paper outlines a framework of key performance indicators to help IT perform better and presents specific KPIs to consider for leaders within different IT domains.


Figure 1. Elements of an IT performance system: governance and alignment (customer satisfaction, IT value, operational excellence, future orientation, with a supporting narrative); delivery (people, processes, and tools such as IT staff, backup, testing, and server automation); and measurement and relationships (data model, performance data, processes).

Key requirements for an IT performance system

To be effective, an IT performance system must meet three requirements:

1. Comprehensive. Your CIO and IT leadership team collectively require oversight and control across all of the people, processes, and systems in the IT domain, from infrastructure to information, applications, and security, and from planning to development and operations. You cannot manage what you cannot measure, so comprehensive visibility into the performance of the entire IT investment portfolio is required for success.

2. Connected. To make informed decisions, IT leaders need a dynamic performance system comprised of IT management and security tools that not only automate their discrete functions, but also deliver insight by feeding digital updates to their KPIs in a timely manner. IT leaders should be able to view and share their KPIs in a series of connected and cascading scorecards so that each leader’s KPIs support their manager’s up to the CIO level.

3. Flexible. An IT performance system needs to provide the flexibility to enable your IT leaders to set and evolve the KPIs that map to their unique priorities, create scorecards that align with their organizational structures, and select the underlying IT management tools that automate and run inside their diverse and heterogeneous operating environments, without creating vendor lock-in.

References: Robert S. Kaplan and David P. Norton, The Balanced Scorecard: Translating Strategy into Action, 1996, ISBN 978-0875846514; https://www.isaca.org/; http://www.itsmfi.org/; http://www3.opengroup.org/; https://www.issa.org/; http://www.edrm.net/

The anatomy of a performance system

An IT performance system consists of three key functional layers: governance, delivery, and measurement, as indicated in Figure 1.

Governance

The process of directing and evaluating IT from a business outcomes perspective. This is also where performance objectives are agreed and documented, establishing a balanced scorecard composed of key performance indicators that are set and tracked for variance from plan.

By providing a written as well as visual history of progress against goals, a scorecard enables your stakeholders to maintain the “story behind the numbers,” or narrative. This is especially important when performance goals are not being met. In these situations, a documented narrative serves as a valuable record that can help isolate root-cause information and/or historical context that would otherwise not show up in a purely data-driven dashboard.
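To make this concrete, the sketch below is a minimal illustration (the field names are hypothetical, not taken from the Executive Scorecard) of a governance-level KPI record that tracks variance from plan and keeps the narrative alongside the numbers:

```python
# Minimal illustration (hypothetical field names): a governance-level KPI record
# that tracks variance from plan and keeps the narrative next to the numbers.
from dataclasses import dataclass, field

@dataclass
class GovernanceKpi:
    name: str
    plan: float                       # agreed performance objective
    actual: float                     # latest measured value
    narrative: list = field(default_factory=list)  # the "story behind the numbers"

    @property
    def variance(self) -> float:
        """Signed variance from plan; negative means behind plan."""
        return self.actual - self.plan

kpi = GovernanceKpi("% of projects on time", plan=90.0, actual=82.0)
kpi.narrative.append("Two projects slipped while a vendor contract was renegotiated.")

print(f"{kpi.name}: {kpi.actual} vs plan {kpi.plan} (variance {kpi.variance:+.1f})")
for note in kpi.narrative:
    print(" note:", note)
```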

Delivery

This is where performance is improved, sustained, or reduced. The delivery aspect of a performance system encompasses the people, processes, and technology involved in the day-to-day delivery of new and existing IT services. There are three critical subsets inside every organization:

• People—including employees, contractors, and vendor partners.

• Processes—typically a hybrid of organization-specific best practices, industry-standard frameworks (such as COBIT, ITIL, TOGAF, ISSA, EDRM), and enterprise-specific processes and cultural norms, which can be documented and/or formalized within an IT performance system.

• Technology—the software systems used by IT leaders and their teams to automate, monitor, and measure specific IT functions. Typically tools vary by role or function. A chief information officer (CIO) will likely focus on high-level tools for strategy, planning, and governance, including tools for financial planning and analysis, portfolio and asset management, and workforce management. A vice president (VP) of applications and his or her organization will likely use tools for application lifecycle management, including quality assurance, project and portfolio management, and business service management. The VP of operations, chief information security officer (CISO), and VP of information and their teams will similarly use specialized tools for ensuring the success of their functions.

Measurement

This is the functional layer where performance metrics are created and performance data is stored. Continual improvement also requires that IT establish a common data model which enables the consolidation of facts from disparate sources. To allow for exception-based management and continual improvement, these measurements should be made consistent such that they can be compared across the processes, systems, and functional teams outlined above.


Figure 2. The HP IT Performance Suite: strategy, planning and governance; application lifecycle management (build faster); operations management (operate simply); information lifecycle management (store efficiently); security intelligence and risk management (secure proactively); and collaboration, orchestration and analytics, built on a common IT Performance Suite foundation (execute systematically).

Building a performance-oriented culture

If you are establishing a performance system, fostering a performance-oriented culture is a critical success factor. A performance system typically brings with it a level of transparency of results that can intimidate employees who are not suitably prepared and, in some cases, drive behaviors that lead to poor outcomes.

For example, your organization might seek to measure the percentage of calls resolved on first contact, a common indicator of operational excellence within a help desk environment. However, if this goal is applied without an accompanying KPI for the percentage of incidents reopened or for customer satisfaction, help desk staff may close incidents prematurely in order to meet their target at the expense of the desired result: a quality outcome and a satisfied customer.

As a result of both of these factors, you should plan for goals to be implemented as part of a balanced system of measures and accompanied by a structured organizational change management program.
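As a simple illustration of such a balanced pairing (the figures below are hypothetical), a first-call resolution rate can be reported together with the reopen rate so that neither is read in isolation:

```python
# Minimal sketch (hypothetical numbers): reading first-call resolution together
# with the reopen rate guards against closing incidents prematurely.
incidents_closed = 1000
resolved_on_first_call = 870
reopened_within_7_days = 140

fcr_rate = resolved_on_first_call / incidents_closed      # 87% looks healthy on its own
reopen_rate = reopened_within_7_days / incidents_closed   # 14% tells another story

print(f"First-call resolution: {fcr_rate:.0%}")
print(f"Incidents reopened:    {reopen_rate:.0%}")

# A balanced target might require both conditions to hold before the KPI
# is reported as on target in the scorecard.
on_target = fcr_rate >= 0.80 and reopen_rate <= 0.05
print("FCR KPI on target:", on_target)
```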


The HP IT Performance Suite

Based on years of experience working with enterprises around the world, HP has created the HP IT Performance Suite (see Figure 2), a comprehensive, connected, and flexible portfolio of software and best practices that enables your IT leaders to run IT as a business. The suite is a complete IT performance system that provides you with the confidence and insight to always perform better.

Enabling the systematic execution of cascaded goals across the HP IT Performance Suite is the HP IT Executive Scorecard, a single pane of glass into the performance of your entire IT portfolio.

Figure 3. Four key reporting areas of an IT Executive Scorecard: IT value, customers, operational excellence, and future orientation, built around a common data model.

The HP IT Executive Scorecard: a system of measures

Critical to delivering better performance is the ability to set, cascade, and measure KPIs such that they become part of the day-to-day behaviors of your IT organization. HP research has identified 170 KPIs associated with world-class IT service, spanning all key IT processes and organizations. Once set, each KPI can then be cascaded through your organization to drive alignment to the performance initiatives of the entire team, effectively establishing a common language to describe IT performance.

The IT Executive Scorecard automates the collection, presentation, and tracking of a growing number of KPIs that populate a system of scorecards that can cascade up and down the IT leadership team. You can compose your own performance system from a KPI library or start with one of the persona-specific editions HP has created for the CIO and key IT function leaders, and then extend it as needed.
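The sketch below is an illustrative model of that cascade (the names and structure are assumptions, not the HP implementation): each leader's scorecard holds its own KPIs and rolls status up to the manager's scorecard, ultimately the CIO's.

```python
# Illustrative model (not the HP implementation): scorecards hold KPIs and
# cascade upward, so a CIO view aggregates the KPIs of direct reports.
from dataclasses import dataclass, field

@dataclass
class KPI:
    name: str
    target: float
    actual: float

    @property
    def on_target(self) -> bool:
        # Simplification: assumes "higher is better" for every KPI.
        return self.actual >= self.target

@dataclass
class Scorecard:
    owner: str
    kpis: list = field(default_factory=list)
    reports: list = field(default_factory=list)   # cascaded scorecards

    def all_kpis(self):
        """This scorecard's KPIs plus those cascaded from reporting scorecards."""
        collected = list(self.kpis)
        for report in self.reports:
            collected.extend(report.all_kpis())
        return collected

    def health(self) -> float:
        kpis = self.all_kpis()
        return sum(k.on_target for k in kpis) / len(kpis)

vp_ops = Scorecard("VP Infrastructure & Operations",
                   [KPI("% of service availability", 99.9, 99.95)])
vp_apps = Scorecard("VP Application Delivery",
                    [KPI("% of projects on time", 90.0, 82.0)])
cio = Scorecard("CIO", [KPI("Innovation delivery", 25.0, 28.0)],
                reports=[vp_ops, vp_apps])

print(f"CIO scorecard health: {cio.health():.0%}")   # 2 of 3 KPIs on target -> 67%
```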

Each HP IT Performance Suite edition features the HP IT Executive Scorecard software bundled with a set of software tools that improve domain-specific performance and automatically populate the KPIs with up-to-date status. This approach is designed to provide rapid benefits realization in terms of measuring and improving your performance along dimensions of cost, quality, innovation, and your organization’s readiness to address future demands.

The data is organized into four categories: IT value, customers, operational excellence (including risk and security), and future orientation, all of which are driven by HP's common data model based on the HP Universal Service Model architecture (Figure 3).

The IT value element of an IT balanced scorecard seeks to capture and categorize business and IT alignment at the highest level, primarily around areas such as alignment with business strategy, the stewardship of the overall IT spend, and improved return on investment as a function of value versus cost.

The customers element of the IT balanced scorecard tracks improvement or declines in metrics that impact customer satisfaction, such as service-level attainment, as well as qualitative metrics, such as customer survey results. Highly correlated to customer satisfaction are service availability and the duration, mean-time-to-repair (MTTR), and number of open incidents associated with a service.
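As a rough worked example (hypothetical outage data; the formulas are the commonly used ones, not prescribed by this paper), availability and MTTR can be derived from a service's outage records:

```python
# Rough illustration of two customer-facing metrics referenced above.
# Outage durations are in minutes over a 30-day reporting period (hypothetical data).
outage_minutes = [42, 15, 88]            # one entry per incident that caused downtime
period_minutes = 30 * 24 * 60

downtime = sum(outage_minutes)
availability = 1 - downtime / period_minutes        # % of service availability
mttr = downtime / len(outage_minutes)               # mean time to repair, in minutes

print(f"Service availability: {availability:.3%}")  # 99.664%
print(f"MTTR: {mttr:.0f} minutes across {len(outage_minutes)} incidents")
```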

The operational excellence element of the IT balanced scorecard seeks to track and maximize the throughput of planned work, such as new projects. This element focuses on streamlining and automating processes for unplanned or reactive work, such as help desk incidents, as well as on metrics that reflect the quality and effectiveness of the broader delivery process, such as measures of rework, escalations, and work backlog. Here you also track security, risk, and compliance metrics.

Finally, to enable your IT leaders to better respond to change, the IT balanced scorecard tracks the future orientation of the IT function. In addition to tracking information about the age and complexity of major IT systems, the CIO edition also tracks measures related to human capital, such as the relative contribution of the external versus internal labor required to respond to enterprise IT demands, which provides a measurable proxy of internal readiness.

Figure 4. Delivering KPI-based scorecards on top of an open data model. The Executive Scorecard layer (administration, KPI library, KPI engine, collaboration, and scorecards, KPIs/metrics, and dashboard pages, with deployment, configuration, and security) sits on data integration, consolidation, and stewardship services and an open data model covering entities such as customer, asset, project, contract, change, incident, cost, service, SLA, and vendor, fed by sources including finance, configuration, business services, application lifecycle management, information lifecycle management, risk and security, operations management, and qualitative data such as XLS files.

HP, along with a number of Fortune 500 companies, has also led the creation and publication of an open, industry-standards-based model for configuration metadata (the open data model). This data model allows you to formalize, specify, and query the relationships between business processes, applications, infrastructure, and service providers in a consistent manner using industry standards such as TQL and XML.

Further information on the open data model can be found in the references section of this document.
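As an illustration only (TQL is HP's topology query language; the sketch below uses a generic graph model in Python rather than TQL itself, and the configuration item names and relationship types are hypothetical), relationships between business processes, applications, infrastructure, and providers can be represented and queried as a typed graph:

```python
# Illustrative only: a generic typed graph of configuration items, standing in
# for the kind of relationships an open configuration data model captures.
from collections import defaultdict

relationships = defaultdict(list)   # source CI -> list of (relationship, target CI)

def relate(source, relationship, target):
    relationships[source].append((relationship, target))

relate("Order-to-cash process", "depends_on", "Billing application")
relate("Billing application",   "runs_on",    "VM cluster A")
relate("VM cluster A",          "hosted_by",  "Cloud provider X")

def downstream(ci, depth=0):
    """Walk every CI the given CI directly or indirectly depends on."""
    for relationship, target in relationships[ci]:
        print("  " * depth + f"{ci} --{relationship}--> {target}")
        downstream(target, depth + 1)

# Query: what does the order-to-cash business process ultimately rely on?
downstream("Order-to-cash process")
```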

Establishing a common data model

A comprehensive and connected IT performance system requires a consistent data model that enables consolidation, sharing, and comparison of trusted metrics, which form a basis for continuous improvement. When establishing a common data model, it is important to use a comprehensive model that is independent of any particular source system and open to any and all source systems, so that the entirety of IT can be covered, including today's public cloud service providers, and so that source changes do not affect performance history as seen by the governance user.

The primary consideration when importing measurement data is enabling consistency across data sources. In order to support both quantitative and qualitative data, the HP IT Performance Suite ETL architecture allows data to be imported and normalized across both traditional SQL databases and flat files, such as spreadsheets or tabular data.
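A minimal sketch of that idea follows (the table layout, column names, and file names are assumptions, not the suite's actual ETL schema); it normalizes records from a SQL source and a spreadsheet export into one consistent fact format:

```python
# Minimal sketch (hypothetical schema and file names): normalize measurements
# from a SQL database and a flat CSV export into one consistent fact format.
import csv
import sqlite3

def facts_from_sql(db_path):
    """Quantitative facts from an operations database (assumed table layout)."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT metric_name, value, recorded_on FROM service_metrics")
        for name, value, recorded_on in rows:
            yield {"metric": name, "value": float(value),
                   "date": recorded_on, "source": "operations_db"}

def facts_from_csv(csv_path):
    """Qualitative survey results exported as a spreadsheet (assumed columns)."""
    with open(csv_path, newline="") as handle:
        for row in csv.DictReader(handle):
            yield {"metric": "customer_satisfaction", "value": float(row["score"]),
                   "date": row["survey_date"], "source": "survey_csv"}

def load(*fact_streams):
    """Consolidate all sources into one list with a common schema."""
    consolidated = []
    for stream in fact_streams:
        consolidated.extend(stream)
    return consolidated

# Example usage with assumed inputs:
# facts = load(facts_from_sql("ops.db"), facts_from_csv("survey_export.csv"))
```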


Building and using your system of measures

Having established a functioning data model and structured a balanced scorecard, the next step is to determine what KPIs to track at what level of the IT organization.

While there are no hard rules, there are a number of best practices to consider as you structure your own system of measures and scorecards:

1. Less is more. A few well-selected KPIs on each leader's and individual contributor's scorecard are more likely to be executed successfully than a comprehensive but overwhelming set.

2. IT is a team sport. Scorecards and associated KPIs should cascade clearly across the IT leadership team (see Figure 5). Many scorecards reinforce silos within IT by creating isolated KPIs for each functional team in the delivery chain. While only one functional team should be accountable for attaining specific KPIs, IT leaders should ensure at least one of the cascaded goals is heavily weighted toward the end-to-end attainment of enterprise outcomes.

3. You get what you inspect, not what you expect. A transparent rewards and measurement system can have a powerful effect on driving behavior. Where possible, ensure that metrics are balanced such that behaviors reflect the broader objectives of the organization.

4. Focus on benchmarks and continuous improvement. Equally important to defining which KPIs to measure is establishing the associated benchmarks and goals. KPIs that are unrealistic reduce the IT team's confidence in achieving them; equally, KPIs that are too easy to achieve may translate into insufficient effort put into improvement. By constantly testing and measuring, and by comparing performance inside the organization and against industry peers, you can set benchmarks that drive continuous improvement in a meaningful manner (a simple illustration follows this list).
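The sketch below is the simple illustration referenced in item 4 (the figures and the target-setting rule are hypothetical): it sets next period's goal between the team's recent performance and a peer benchmark so that targets stay ambitious but realistic.

```python
# Hypothetical illustration: set next period's KPI target between the team's
# recent performance and a peer benchmark, so goals are neither unrealistic
# nor trivially easy.
recent_quarters = [78.0, 81.0, 83.0]     # % of projects on time, internal history
peer_benchmark = 92.0                     # industry peer level (assumed figure)

current = recent_quarters[-1]
improvement_step = 0.25                   # close a fixed fraction of the gap each period
next_target = current + improvement_step * (peer_benchmark - current)

print(f"Current: {current:.1f}%  Peer benchmark: {peer_benchmark:.1f}%")
print(f"Suggested next-quarter target: {next_target:.1f}%")   # 85.3%
```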

Figure 5. An IT performance system provides scorecards that cascade to multiple levels of IT leadership, from the CIO and the Office of the CIO to the VP of Application Delivery (with the Project Management Office, Business Analyst, Quality Assurance, and Development teams), the VP of Infrastructure & Operations (with the Network Manager and Service Desk Manager), and the Chief Information Security Officer.

Key takeaways

In order to always perform better, CIOs and IT executives need a digitized and automated system that continuously measures, communicates, and improves IT performance by instrumenting the entire IT function in a comprehensive manner. To gain timely insights and provide fact-based evidence of how IT supports the business, it is essential that such a performance system dynamically updates KPIs and is connected to every element of the IT landscape.

The IT performance system should provide the flexibility for IT to move fast, leverage existing capabilities, operate in a heterogeneous landscape, and exploit new technology and opportunities to drive better business outcomes.

The HP IT Performance Suite not only meets these core requirements—comprehensive, connected, and flexible—but also offers your IT leaders rapid deployment editions for KPIs and scorecards to accelerate their time to insight and promote greater confidence.

Suggested next steps

If you are interested in establishing an IT performance system, you can select from a range of HP workshops, including:

• IT Performance Management Workshop that accelerates alignment around an IT end-state vision required to deliver the desired enterprise outcomes in the context of the current shifts in IT production and consumption models (four-hour workshop onsite, with follow-up)

• Value Discovery Workshop that develops a roadmap for key transformational IT initiatives, including people, process, technology, and related KPIs (approximately four days onsite)

• Readiness Workshop that assists in building out process-and-tool specific implementation plans for better business outcomes (one to two days onsite)

If you are interested in sharing and improving best practices with your peers, join the Discover Performance Community: www.hp.com/go/discoverperformance/reg

Appendix: key performance indicators

HP research has identified a number of KPIs frequently associated with measuring performance at leading IT organizations across the domains of customers, IT value, operational excellence, and future orientation. The actual number and specific KPIs implemented by customers will differ based on the performance framework they design as well as their unique culture and performance initiatives.

Note: We have not provided targets and best-practice levels for KPIs, as they vary by industry and need to be tailored to each customer's circumstances. Also, in some cases there are multiple ways to measure a KPI (such as customer satisfaction), and this paper does not prescribe a single formula for measuring a KPI. HP and our partners, as well as third parties, offer workshops and other programs to help customers select KPIs, determine appropriate measurement criteria, and apply best-practice benchmarks.

Table 1. Customers

Solution domain | Optimization opportunity/KPI | Personas: CIO, VP of Operations, VP of Applications, VP of Information Management, Chief Security Officer (an "x" marks each persona to which the KPI applies)

Applications % deviation from planned hours worked x x x

Applications % of healthy projects x x x

Applications % of project tasks on time x x x

Applications % of projects on time x x x

Applications % of projects with unresolved urgent issues x x x

Applications Customer satisfaction via survey (sometimes in lieu of timeliness and/or budget KPIs) x x x

Information Management Improve user access to information x

Information Management Time to respond to e-discovery x

Operations # unplanned outages per month x x

Operations % of IT budget spent on maintaining existing systems x x

Operations % of met SLAs x x

Operations % of met SLOs for IT process Activities x x


Operations % of outages/total SLA uptime over time x x

Operations % of satisfied customers x x

Operations % of service availability x x

Operations % of service performance not met x x

Operations Availability & performance SLAs of mission critical applications x x

Operations Avg. outage duration per incident x x

Operations Customer satisfaction for help desk x

Operations Hours monthly downtime x x

Operations Mean time to implement changes x

Operations Mean time to repair incidents x

Operations Mean time to resolve incidents & alerts x x

Operations MTBF of business services x x

Operations MTTR of business service incidents x x



Table 2. IT value

Solution domain | Optimization opportunity/KPI | Personas: CIO, VP of Operations, VP of Applications, VP of Information Management, Chief Security Officer (an "x" marks each persona to which the KPI applies)

Applications % of change in projects cost x x x

Applications % of projects - cost reduction x x x

Applications % of projects associated with business objectives x x x

Applications % of projects budget at risk x x x

Applications Innovation - hours worked on new vs. maintenance x x

Applications On budget variance x x x

Financials % of actual vs planned projects cost x

Financials % of capex vs opex spending x

Financials % of change in business service cost x x

Financials % of IT POR vs. total revenue x x

Financials % variance of actual vs. planned costs x x

Financials Avg cost of IT—delivery per customer x x

Financials Business service cost reduction x x

Financials Innovation delivery x x x

Information Management % of data/information stored with 3rd parties x

Information Management % of information management processes that are automated x

Information Management Customer satisfaction with information management processes x

Information Management Past budget vs. current budget for data/information stored in cloud x

Information Management Fully loaded cost per terabyte x

Information Management ROI analysis x

Operations % of assets—cost reduction x x x

Operations % of software license in use x x x

Operations Availability & performance SLAs of mission critical applications, especially cloud providers (How close am I to breaching a contract?) x

Operations Cost of monitoring per application x

Operations % of IT budget spending on opex x x

Operations Revenue loss per outage x x

Operations The revenue impact of a business service outage x


Table 3. Future orientation

Solution domain | Optimization opportunity/KPI | Personas: CIO, VP of Operations, VP of Applications, VP of Information Management, Chief Security Officer (an "x" marks each persona to which the KPI applies)

Applications % of employee utilization rate x x

Applications % of FTE x x x x x

Applications % of project effort done by external resources x x x x x

Applications % of project effort done by external staff x x x x x

Applications Employee retention/morale x x

Information Management Litigation readiness to support in-house, automated, repeatable e-discovery process x

Operations % of employees exceeding leadership competency model x x x x x

Operations % of satisfied employees x x x x x

Operations Employee turnover x x x x x


Table 4. Operational excellence

Solution domain | Optimization opportunity/KPI | Personas: CIO, VP of Operations, VP of Applications, VP of Information Management, Chief Security Officer (an "x" marks each persona to which the KPI applies)

Applications Avg delivery time for new products or services x x x

Applications Avg project initiation time x x x

Applications Effort variance x

Applications Nimbleness—Mean duration from change request to production x x

Applications On time delivery variance x x x

Applications Quality (defect leakage, normalized by % per function point or hours worked) x

Applications Resource utilization x x

Applications Time-to-market of new products or services x x x

Information Management # of data leakages/losses x

Information Management # of past platforms vs. current # of platforms x

Information Management # of sanctions or penalties for non-compliance with discovery or regulatory request x

Information Management % improvement of staff productivity and process improvements x

Information Management % of information assets governed by proper access controls x

Information Management % of legacy data classified as expired and disposed of x

Information Management % of records classified with retention policy x x

Information Management % of storage mapped to tiered storage profiles x x

Information Management % of tests/simulations passed measuring impact on business services through policy non-compliance x

Information Management % of user information/data backed up under retention compliance policies x

Information Management % virtual server/storage implementation x

Information Management Active user processes x

Information Management % of business records with retention policy x

Information Management Compliance audit x

Information Management Compliance/regulatory mandates supported x

Information Management Control storage capacity with deduplication, compression and storage tiering x

Information Management Dispositions scheduled, completed by number and storage size x

Information Management Hours worked/time spent searching for information x

Information Management Implement policy based information archiving strategy x

Information Management Ingestion volumes x


Table 4. Operational excellence—continued

Solution domain | Optimization opportunity/KPI | Personas: CIO, VP of Operations, VP of Applications, VP of Information Management, Chief Security Officer (an "x" marks each persona to which the KPI applies)

Information Management Number of Data sources x

Information Management Past storage cost vs current storage cost x

Information Management Pursue/implement cloud services strategy x

Information Management Current and tested backup and recovery architecture x

Information Management Retention schedules, by number and storage size x

Information Management Tier 1 storage growth x

Information Management Total records/documents/objects stored x

Information Management Upgrade disaster recovery and business continuity capabilities x

Information Management Headcount ratio—storage capacity to one admin x x

Information Management Mean time to identify storage bottleneck x x

Information Management Storage capacity reclaimed x x

Operations % events processed without human intervention x

Operations % of assets in maintenance x x

Operations % of assets returned to supplier x x

Operations % of changes resulting in outage x x

Operations % of devices compliant with corp policies x

Operations % of devices compliant with regulatory policy x

Operations % of emergency changes x x

Operations % of escalated Incidents x x

Operations % of FCR x x

Operations % of incident aging x x

Operations % of interactions in backlog x x

Operations % of network service availability x

Operations % of outages due to changes x

Operations % of problems resolved within the required time period x

Operations % of reopened Incidents x x

Operations % of SLA expirations x x

Operations % of SLAs coverage x x

Operations % of unauthorized implemented changes x x

Operations % of unplanned changes x x

Operations % of urgent changes x x

Operations % problems with RC x

Operations % servers under management x

Operations % storage utilization x x

Operations Application (custom app) provisioning (hrs.) x


Table 4. Operational excellence—continued

Solution domain | Optimization opportunity/KPI | Personas: CIO, VP of Operations, VP of Applications, VP of Information Management, Chief Security Officer (an "x" marks each persona to which the KPI applies)

Operations Availability of network devices x

Operations Average age of hardware assets x

Operations Average time to procure x x

Operations Time needed to achieve, percentage correct x

Operations Change, configuration management Accuracy x

Operations Closure duration x x

Operations Database provisioning (hrs.) x x

Operations Ensuring optimal connectivity x

Operations Headcount ratio—client devices to one admin x

Operations Headcount ratio—databases to one DBA x x

Operations Headcount ratio—network devices to one admin x

Operations Headcount ratio—servers to one admin x

Operations Infrastructure utilization (server, storage) x

Operations Mean time between failures x

Operations Mean time to resolution for network incidents x

Operations Mean time to restore services x


Operations Network efficiency/latency x

Operations Network provisioning (hrs.) x

Operations Number of decommissioned servers x

Operations % of IT budget spending on opex x x

Operations Physical server provisioning (hrs.) x

Operations Problem queue rate x

Operations Storage provisioning (hrs). x x

Operations Virtual server provisioning (hrs.) x

Operations VM to server ratio x

Operations Changes to backlog size x

Security Number of security and compliance (corporate, government, industry) violations x

Security Security and compliance (corporate, government, industry) x

Security Simulation/change & configuration impact on security and compliance x

Security Status of organizational controls for each compliance area x x x x x

Operations Status of compliance to internal policies x x x x x

Operations Device coverage (%)—devices protected by security controls x x

Operations Number of security incidents open/closed, response time x

Operations Patch latency (avg missing patches/machine, days old) x x

Operations Status of on-going “posture” improvement efforts as aligned to priorities and objectives x

Operations Platform compliance (0-10)—ports open, default passwords/policies, etc. mapped to CIS controls x x


Table 4. Operational excellence—continued

Solution domain | Optimization opportunity/KPI | Personas: CIO, VP of Operations, VP of Applications, VP of Information Management, Chief Security Officer (an "x" marks each persona to which the KPI applies)

Operations Regulatory controls—top out-of-compliance controls x

Operations Level of risk by asset class (quarterly) x

Operations Accepted loss expectancy (annual, by asset class, $ revenue) x

Operations Safeguard implementation status x

Operations Total accepted cumulative risk (dynamic measure) x

Operations Data loss prevention controls x x x

Operations Status of security project activities and remediation/cases (time/budget) x


Share with colleagues. Get connected: www.hp.com/go/getconnected

Get the insider view on tech trends, alerts, and HP solutions for better business outcomes

© Copyright 2011 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice. The only warranties for HP products and services are set forth in the express warranty statements accompanying such products and services. Nothing herein should be construed as constituting an additional warranty. HP shall not be liable for technical or editorial errors or omissions contained herein.

4AA3-8565ENW, Created November 2011