
284 23-3106 Uen Rev A

User Service Performance

February 2007

White paper

An optimized selection of specified system service performance indicators

will increase the ability to guarantee user perceived service performance.


Contents

1 Executive summary
2 End-to-end User and System Service performance
2.1 Market and telecom trends
2.2 User and System Services
2.3 Prerequisites for user & system performance
2.4 End-to-end data collection
3 Selection of System Service KPIs – S-KPIs
3.1 Using a service model
3.2 QoSS – a new pragmatic level between QoS and QoE
4 User Service Performance Assurance
4.1 User Service performance assurance baseline
4.2 Service Performance monitoring
5 Operator value
6 Conclusion
7 Glossary
8 References


1 Executive summary

Telecom services are part of our daily lives, and it is crucial that they perform to

expectations. In the maturing global telecom market, service performance and time-to-market are key differentiators (Ref. 1 & 2).

A major challenge is the immense amount of data that is produced by a telecom

network and many hundreds of subscriber services. There are no standards or guidelines on which indicators to choose, which means a variety of service performance indicators is in use in the telecom community.

This white paper outlines a structured approach to system service performance so that user service performance can be properly measured.

Understanding the complexity of user service performance requires detailed

knowledge of system service performance. Examples of system services are MMS,

streaming and TV services. System services are user-independent and are the building blocks for numerous user services, such as news, music and games. Knowing how the system services perform is therefore essential in order to secure the best user service performance.

Achieving excellent service performance requires a top-down approach, where the

selection, evaluation and approval of adequate performance indicators are carried

out system end-to-end and before service launch.

For each system service, a vital few key performance indicators (KPIs) are carefully

selected that best reflect the service performance from the user perspective. These

KPIs need to be complete, comparable and similar in their implementation, and are

named System Service KPIs (S-KPIs). They must be constantly evaluated to ensure

sufficient performance over the service lifecycle. S-KPIs are documented per service

in an S-KPI Index, the Service Performance Index (SPI).

This structured approach forms a Service Performance Assurance Baseline, and

provides telecom players with an agreed standpoint from which they can discuss user performance for SLAs, Telecom Management and O&M purposes.

Defining and agreeing on such a baseline gives a new, firm level against which

performance can be monitored. With a measurable system service level, the Quality

of System Service (QoSS) can be set, offering a cost-efficient simplification and a

pragmatic level that also paves the way for a Quality of User Service (QoUS) level.

A predictable service performance level enables fast launches at reduced cost and

attracts new subscribers. This ensures the correct focus from the start, as well as a

sustained high-quality brand, market share and capability of providing the right

proposition at a valid price with fast return on investment.


2 End-to-end User and System Service performance

2.1 Market and telecom trends

The public's knowledge and experience of telecom services are increasing, and so are

expectations of service performance. In the maturing telecom market, in which new

services are created and deployed ever faster, users increasingly judge operators by

their service portfolio and service performance.

Overall network evolution is driven by three factors: cost savings; new user services

and values; and evolving business models and interfaces. For example, one major

evolution step is the introduction of IP Multimedia Subsystem (IMS), driven by

network rationalization (VoIP), as well as services such as multimedia

communications, interactive personalized TV, and push-to-talk.

Simultaneously, access network capabilities are improving all the time, enabling

more services to be delivered. It is increasingly complex to integrate service creation

and delivery into a network, while at the same time the window of opportunity is

shrinking for each new feature. This calls for a strategy for the network functionality: hiding as much of the network complexity as possible.

This structure promises the user service transparency, service continuity and always being best connected. Considering the diversity and richness of devices, access

technology and networks, we need to simplify matters for the user as well as for the

operator. The evolved architecture enables every access network to deliver its full

potential, while at the same time ensuring that service implementation only requires

knowledge about the specific access characteristics (such as performance

indicators).

Network elements in an operator's network can today produce anything from 40 up to 4,000 different raw counters that describe their behavior. With the foreseen architecture

evolution, the number of counters will explode.

Telecom operators therefore need KPIs specified both within their systems and

managed service contracts, in order to monitor and measure system service

performance.


2.2 User and System Services

Hundreds of user services are today offered by operators, which makes

performance monitoring of all services quite complex. Addressing this complexity

requires knowledge of the system services and system service performance.

Separation of services into user and system services helps in defining what to

measure and monitor.

Figure 1) User and system services. For example, CNN with content requires the system services Web Browsing and Mobile TV.

A system service is user-independent and can therefore be standardized. System

services are the building blocks for numerous user services. Concentrating on the

performance of system services enables well-defined and cost-efficient monitoring of

all services.

A system service is a combination of access, core and service domain infrastructure

and related terminal equipment. Examples include MMS, mobile TV and video

telephony.

A user service consists of one or more system services, including the actual terminal

used.


2.3 Prerequisites for user & system performance

User and system service performance indicators, and their definitions, are based on

well-known and accepted standards (Ref. 8 & 9). Figure 2 shows the different

performance levels that have been defined.

Figure 2) End-to-end performance definitions; see text below for explanations

The Bearer Service comprises the radio and core network involved in service

delivery.

System Service (QoSS) is the combination of network and nodes involved in the

service delivery, including signaling performance of the terminal.

Operator Perceived Service is how the operator perceives the system service,

including its own network/service management systems.

User Service (QoUS) is how the user perceives the service, including the actual

terminal used (display and speaker).

Quality of Experience (QoE) is how the user perceives the service, including operator

and customer services, based on expectations of the offered services.


2.4 End-to-end data collection

Measuring the end-to-end service quality for every session and every user requires a combination of data from several sources, including:

- Infrastructure data – alarms and performance data from network and nodes
- Traffic data – information from ongoing commercial traffic
- User data – user-perceived performance

Figure 3) End-to-end data collection
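As an illustration only, the sketch below (in Python, with hypothetical class and field names and made-up values) shows one possible way to combine the three collection sources into a single per-session service record.

from dataclasses import dataclass, field
from typing import Dict

# Hypothetical per-session record combining the three sources listed above:
# infrastructure data, traffic data and user data.
@dataclass
class SessionRecord:
    session_id: str
    service: str                         # system service, e.g. "MMS"
    infrastructure: Dict[str, float] = field(default_factory=dict)  # node counters, alarms
    traffic: Dict[str, float] = field(default_factory=dict)         # live commercial traffic
    user: Dict[str, float] = field(default_factory=dict)            # user-perceived values

def merge_sources(session_id: str, service: str,
                  infra: Dict[str, float],
                  traffic: Dict[str, float],
                  user: Dict[str, float]) -> SessionRecord:
    """Combine the three collection sources into one end-to-end record."""
    return SessionRecord(session_id, service, infra, traffic, user)

# Example with made-up values.
record = merge_sources(
    "sess-001", "MMS",
    infra={"core_alarms": 0, "radio_drop_rate": 0.4},
    traffic={"delivery_time_s": 6.2},
    user={"perceived_delivery_time_s": 7.0},
)
print(record.service, record.traffic["delivery_time_s"])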


3 Selection of System Service KPIs – S-KPIs

The immense number of available KPIs (as illustrated in Figure 4) presents a huge

challenge in meeting requirements for user service performance, service assurance

and Service Level Agreements (SLAs). What is more, vendors, operators and service

providers rarely choose the same KPIs.

Figure 4) An immense number of KPIs is available from the network.

A first step in KPI harmonization is an initiative in which a group of WCDMA KPIs

has been drafted to measure the performance of the WCDMA access network (Ref. 7). Other initiatives are under way within the ITU (Ref. 1 & 2).

In order to be able to discuss system service performance, and even user service

performance, a top-down approach is required. Telecom players need to select and

agree on the vital few KPIs for each standard system service.

By selecting the system service KPIs that best represent the expected service

performance from a user perspective, a baseline for monitoring service performance

is created. For this purpose, the trigger point must be defined for the system service

as well as the actual user service. Concentrating on the performance of system

services enables well-defined and cost-efficient monitoring of all services.

To distinguish them from KPIs in general, the selected service KPIs are known as

System Service KPIs (S-KPIs).
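Purely as an illustration of how a trigger-point-based S-KPI could be computed from collected traffic data, the sketch below calculates a hypothetical MMS end-to-end delivery time; the trigger points, timestamps and 10-second target are invented for the example and are not definitions from this paper.

from statistics import mean
from typing import List, Tuple

# Each tuple holds two assumed trigger-point timestamps in seconds:
# (t_submit, t_deliver) - when the MMS is submitted by the sending terminal
# and when delivery is confirmed by the receiving terminal.
sessions: List[Tuple[float, float]] = [
    (0.0, 5.8), (10.0, 17.1), (20.0, 26.4),
]

def delivery_times(samples: List[Tuple[float, float]]) -> List[float]:
    """End-to-end delivery time between the two trigger points."""
    return [t_deliver - t_submit for t_submit, t_deliver in samples]

times = delivery_times(sessions)
print(f"Mean MMS delivery time: {mean(times):.1f} s")                  # the S-KPI value
print(f"Share within 10 s target: {sum(t <= 10.0 for t in times) / len(times):.0%}")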


Figure 5) End-to-end selection of a maximum of 10 S-KPIs per service

It is crucial that the selected performance indicators are well defined and agreed at

an early stage in order to meet user expectations. An early agreement on S-KPIs

enables a structured dialogue between customer and vendor. Today this dialogue is

often prolonged by the delicate and often tedious selection of which performance

indicators are the best to use, and then by the necessary adaptations to enable

acceptance of those chosen.

3.1 Using a service model

A service model (as illustrated in Figure 6) is needed to show the relations between

system services, KPIs and available system performance information in order to

understand and present a true user view. That means mapping all S-KPIs covering

the servability areas (accessibility, retainability and integrity) to the information

available from alarms, counters, event logs, traffic and transaction information

throughout the whole system. Accessibility is how easy it is to start a service;

retainability is the ability to keep the service going; integrity is the quality of things like

throughput and video.


Figure 6) Service Model for collection of S-KPIs for a system service
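A minimal sketch of what such a service model could look like in code is given below; the service, the S-KPI names and the data sources are hypothetical and chosen only to illustrate the mapping from servability areas to the underlying information.

# Hypothetical service model for one system service ("Mobile TV"):
# each S-KPI is tagged with its servability area and the raw information
# sources (counters, alarms, probes) it is derived from.
SERVICE_MODEL = {
    "stream_setup_success_ratio": {
        "area": "accessibility",
        "sources": ["rtsp_setup_counters", "core_node_alarms"],
    },
    "stream_cut_off_ratio": {
        "area": "retainability",
        "sources": ["session_release_events", "radio_drop_counters"],
    },
    "video_quality_score": {
        "area": "integrity",
        "sources": ["probe_measurements", "throughput_counters"],
    },
}

def skpis_for_area(area: str) -> list:
    """Return the S-KPIs that cover a given servability area."""
    return [name for name, info in SERVICE_MODEL.items() if info["area"] == area]

print(skpis_for_area("integrity"))   # ['video_quality_score']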

3.2 QoSS – a new pragmatic level between QoS and QoE

Measuring QoE is very difficult, given the various factors that affect users' perception

of service quality. QoE embraces network coverage, service offers, support levels

and other subjective factors like price. Many of these factors are in the sole hands of

the operator or service provider.

QoS is a technical concept within the network and network elements. QoS measurements cannot be used as a basis for discussing user service performance, since QoS is

more a description of system capabilities.

The telecom market's demands for high service performance call for another level of

quality measurement between QoS and QoE.

The effort here to define and in principle standardize a system service level and

related S-KPIs provides a firm, measurable Quality of System Service (QoSS) level.

This is a cost-efficient simplification and pragmatic level that fills the gap between

QoS and QoE. QoSS enables the monitoring of a Quality of User Service (QoUS)

level.

To come as close as possible to the user perception of the service, considerable

effort is needed when defining the system service performance indicators (S-KPIs).

At the same time, the concept must guarantee a clear baseline for SLAs, OSS and

Telecom Management.


There are some golden rules for selecting the vital few system service performance

indicators. The first is to choose S-KPIs with a system end-to-end focus and from the

user point of view – no more than ten per system service. Trigger points should be

defined for the system and the user service if possible. All S-KPI names should be

access- and system-independent, and a tool-independent measurement description

should be included. Standards must be specified with some additional adaptations to

local rules. A common S-KPI structure is an indicator name together with its unit (s, %, m ...).
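To make the structure concrete, the sketch below captures such an S-KPI description as a small data record; the field names, the example streaming S-KPI and its trigger points are assumptions for illustration only, not definitions from this paper.

from dataclasses import dataclass

@dataclass
class SKPI:
    """Hypothetical, tool-independent S-KPI description."""
    name: str            # access- and system-independent name
    unit: str            # e.g. "s", "%", "m"
    start_trigger: str   # trigger point for the system service
    end_trigger: str     # trigger point for the user service, if defined
    measurement: str     # "active" (probe-based) or "passive" (traffic-based)

# Example: an assumed streaming-accessibility S-KPI (at most ten per system service).
stream_setup_time = SKPI(
    name="streaming_session_setup_time",
    unit="s",
    start_trigger="user requests the stream",
    end_trigger="first media frame rendered on the terminal",
    measurement="active",
)
print(stream_setup_time)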

The S-KPI description must include a measurement specification that gives detailed

information on the access technology used. The handling of active and passive measurements can be specified; when applicable, signaling sequences for probe-based measurements can be defined. A measurement survey is needed to specify

what tools to use in different measurement scenarios.

With careful selection and agreement of S-KPIs, performance indicators will be

complete, similar and comparable in their implementation. A constant evaluation of

the S-KPIs is needed to ensure sufficient performance during the service lifecycle.

4 User Service Performance Assurance

Mapping the system service level to user services, and then the optimized selection, definition and finally the specification of S-KPIs, is a step-by-step approach. The approach

paves the way to form a user service assurance baseline with accurate S-KPI

descriptions, Service Performance Index (SPI) and related service performance

monitoring areas.

To be able to assure a user performance level in a specific timeframe, the S-KPIs for a specific system service are documented in an S-KPI Index (SPI). The SPI includes

S-KPI values in a specified environment and time frame. It could also include

possible monitoring and measurement tools.
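As a purely illustrative example of what such an SPI could contain, the sketch below records assumed target values for a hypothetical Mobile TV service in a stated environment and time frame, together with a simple check of measured values against the targets; all names and figures are invented.

# A purely illustrative SPI: target values for the S-KPIs of one system
# service, valid in a stated environment and time frame.
SPI_MOBILE_TV = {
    "service": "Mobile TV",
    "environment": "WCDMA, urban, busy hour",
    "time_frame": "2007-Q1",
    "targets": {
        # name: (unit, target, direction); direction tells whether higher
        # or lower measured values are better.
        "stream_setup_success_ratio": ("%", 98.0, "higher"),
        "stream_setup_time":          ("s", 6.0, "lower"),
        "stream_cut_off_ratio":       ("%", 1.0, "lower"),
    },
    "tools": ["drive-test probe", "network counters"],   # optional
}

def meets_spi(measured: dict, spi: dict) -> bool:
    """True if every measured S-KPI value meets its SPI target."""
    for name, (unit, target, direction) in spi["targets"].items():
        value = measured[name]
        ok = value >= target if direction == "higher" else value <= target
        if not ok:
            return False
    return True

print(meets_spi({"stream_setup_success_ratio": 98.7,
                 "stream_setup_time": 5.1,
                 "stream_cut_off_ratio": 0.8}, SPI_MOBILE_TV))   # True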

With the SPI, the Quality of System Service level is documented, to be used also for SLAs, service monitoring, Telecom Management and O&M purposes.

The documented and measurable QoSS level paves the way also for addressing the

QoUS.

The baseline, its relation to the service model and how it can be presented for service monitoring purposes increase the ability to assure user-perceived service

performance.


4.1 User Service performance assurance baseline

The user service assurance baseline includes documentation of:

- the required user services mapped to system services
- the selected S-KPIs
- the SPI

The baseline is to be used for:

- product descriptions, stating which S-KPIs are to be used at Integration & Verification, acceptance test and O&M (with different values)
- writing Performance Monitoring guides recommending what to measure hourly, daily and weekly to secure service quality
- defining O&M service reports
- performance reports/SLAs for Managed and Hosted Services
- monitoring in audits and at System Integration/Service Assurance
- reporting of Service Performance

4.2 Service Performance monitoring

The main users of service quality information in an operator's organization are customer support, the service and network operation centres, and the managers responsible for services and marketing. They all need a unique set of reports to be able to supervise

and tune the service delivery process to secure good service quality. The S-KPI

definitions form the basis of all this information.

The reports must be accessible remotely and cover areas such as service availability

in real-time and SLA views, S-KPI values and trends, GIS view of service quality and

distribution of used services. The service model should be accessible online and

used to pinpoint trouble areas when an S-KPI value goes beyond its defined threshold.
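A minimal sketch of such threshold supervision is shown below; the S-KPI names, threshold values and directions are assumptions chosen only to illustrate how breached S-KPIs could be identified as starting points for pinpointing trouble in the service model.

from typing import Dict, List

# Assumed alarm thresholds for two S-KPIs of one system service; the values
# and directions are invented for illustration.
THRESHOLDS = {
    "stream_setup_success_ratio": (95.0, "below"),   # alarm if below 95 %
    "stream_setup_time":          (8.0, "above"),    # alarm if above 8 s
}

def breached_skpis(latest: Dict[str, float]) -> List[str]:
    """Return the S-KPIs whose latest value has gone beyond its threshold;
    these are the starting points for pinpointing trouble in the service model."""
    breached = []
    for name, (limit, direction) in THRESHOLDS.items():
        value = latest[name]
        if (direction == "below" and value < limit) or \
           (direction == "above" and value > limit):
            breached.append(name)
    return breached

print(breached_skpis({"stream_setup_success_ratio": 93.2,
                      "stream_setup_time": 6.5}))
# ['stream_setup_success_ratio']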


Figure 7) Service Monitoring view

5 Operator value

The knowledge gained from system service key performance indicators can give

operators a vital edge in the competitive global market. The opportunity to improve

end-user perceived service quality can mean faster and higher revenue per user

service. These new tools can be a major advantage when negotiating service

performance through SLAs with third-party content and service providers.

In many advanced markets, the effect of service performance quality on market

share, traffic growth and time to revenue is well known. Service performance that

matches or exceeds user expectations can also decrease helpdesk costs.


Figure 8) Potential Mobile TV barriers, according to users in a Consumer lab study.

Service performance is important.

On the other hand, poor performance jeopardizes business: costs increase and

return on investment will be squeezed and probably reduced. Symptoms of poor

performance include later general service availability and later market introduction.

Costs are pushed up by the need for additional measurements and testing, and late

design improvements, for example. Another sign is a later take-up of the service, as

poor performance leads to lower usage.

By monitoring and maintaining the selected S-KPIs, the operator can offer best-in-class

services that satisfy users and sustain a high-quality brand image.

Other gains from good performance include lower IT environment costs, as less

effort is needed for monitoring and more time can be spent on analyzing the

implications of the results. It also enables more accurate dimensioning of IT support.

Figure 9) Good performance will boost business: costs will decrease and the return on investment will come faster.


6 Conclusion

The performance of telecom services is an increasingly important differentiator in the

maturing telecom market. However, enhancing the ability to guarantee high user

service performance requires certain steps to be taken.

The separation of services into user and system services provides a firm level

against which to measure and monitor. A system service is user-independent and so

can be standardized. Knowing the performance of system services enables cost-efficient monitoring of all services.

By identifying and monitoring a certain selection of service KPIs, the availability of

sufficient service quality for the user can be determined for all sessions and all users

on a 24-hour basis, increasing the ability to guarantee user-perceived service

performance.

The careful selection of a vital few KPIs for each system service means S-KPIs will

be complete, comparable and similar in their implementation. This enables operators

and vendors to discuss perceived user performance from an agreed standpoint: the user service performance assurance baseline.

This performance baseline defines a firm level between QoS and QoE. Defining and mapping the system service level to user services enables the Quality of System Service to be monitored and measured, along with the Quality of User Service.

Based on this, the operator can focus on the more subjective areas that determine

QoE.

7 Glossary

Accessibility How easy it is to start a service

Integrity Quality of, for example, throughput and video

Infrastructure data Alarms and performance data from network and nodes

GIS Geographical Information System

KPI Key Performance Indicator

KQI Key Quality Indicator

KPI & KQI From a set of available performance indicators, an operator

chooses a small subset of those that from his viewpoint best

represent the network behavior as KPIs or KQIs (Ref. 3, 4 & 6).


MMS Multimedia Messaging Service

OSS Operation Support System

O&M Operation & Maintenance

Retainability The ability to keep the service going

S-KPI System Service KPI

SLA Service Level Agreement

SPI Service Performance Index

System Service The combined performance of access, core and service

domain infrastructure and related performance in the terminal

equipment

System Service Availability The percentage of time a service fulfils the end-user needs, measured by its S-KPIs

Traffic data Information from ongoing commercial traffic

User data User-perceived performance

User Service One or more system services, including the performance of the

actual terminal used

QoUS Quality of User Service

QoE Quality of Experience; the user's perceived experience of

what is being presented by a user service interface

QoS A technical concept measured within the network and network

elements. QoS parameters are standardized. Quality of

Service level is only a subset of QoE. Even if all defined traffic

QoS parameters are met, this does not guarantee a satisfied

user.


8 References

1. ITU Standard for KPI Structure, E.800

2. ITU workshop, June 2006, http://www.itu.int/ITU-T/worksem/qos/200606/programme.html

3. TMF SLA Management Handbook, GB917-2, 2005

4. TMF SLA Management Handbook for KQI, KPI, and QoS Definitions, GB-

2917

5. Enhanced Telecom Operations Map (eTOM), GB921

6. ETSI Definition of Quality of Service KPIs for end-user measurement, TS 102 250-2

7. Ericsson & Nokia Architecture description: Managing the performance management data flow in mobile network management solutions

8. ETSI TR 102 479 V1.1.1 (2006-02)

9. ETSI TR 102 274 V1.1.2 (2004-01)