
Performance Evaluation and Modeling of SaaS Web Services in the Cloud

Abdallah Ali Zainelabden Abdallah Ibrahim

PhD Defense / University of Luxembourg (UL)

January 10th, 2020

Dissertation Defense Committee:

Chairman A-Prof. Dr. Ulrich Sorger University of Luxembourg, Luxembourg

Vice-Chairman Prof. El-Ghazali Talbi INRIA, University of Lille, France

Jury Member Dr. Dzmitry Kliazovich Oply Mobility, Luxembourg

Ph.D. supervisor Prof. Dr. Pascal Bouvry University of Luxembourg, Luxembourg

Ph.D. advisor Dr. Sébastien Varrette University of Luxembourg, Luxembourg


Summary

1 Context & Motivations

2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud (Modeling Module, Stealth Module, SLA Checker Module)

3 Conclusion and Perspectives



Context & Motivations

Introduction

[Timeline: evolution of computing, 1960s-2020s: mainframes and supercomputers (1960s-70s), PC / networks (1980s), computing service and application providers, virtualization frameworks (1990s), cloud computing (2005), high-performance computing (2007-2010s), fog/edge and IoT/sensors (2020s)]

Cloud Computing (CC) [Source: NIST (National Institute of Standards & Technology)]

Network access to a shared pool of configurable computing resources*, which is:
→ ubiquitous
→ convenient
→ on-demand

* e.g., networks, servers, storage, applications, and services


Context & Motivations

Cloud Computing Service Models

→ Infrastructure as a Service (IaaS): Amazon WS, Google Cloud, Microsoft Azure, ...

→ Platform as a Service (PaaS): Microsoft Azure, Google AppEngine, Heroku, SalesForce

→ Software as a Service (SaaS): GMail, Office 365, GoogleApps, MongoDB, ...

→ Whatever as a Service (<x>aaS): ... as soon as it runs in a pay-per-use model over Cloud resources

Context & Motivations

Public Cloud Market Share: SaaS

[Figure: public cloud market revenue (US$ billions), 2018-2022, by segment (SaaS, BPaaS, IaaS, PaaS); SaaS share: 45% in 2022]

[GAR19] K. Costello et al. Gartner Forecasts Worldwide Public Cloud Revenue to Grow in 2020, Gartner, 2019.

Context & Motivations

Service Level Agreement (SLA)

[Diagram: negotiation process over the cloud services between the Cloud Services Provider (CSP) and its customers]

Service Level Agreement (SLA)

A contract which defines exactly what services a Cloud Services Provider (CSP) provides
→ and the required level or standard for those services [SLA09]

[SLA09] P. Patel et al. Service level agreement in cloud computing, 2009.


Context & Motivations

Service Level Agreement (SLA)

[Diagram: the Cloud Services Provider (CSP) and the Cloud Services Customer (CSC) negotiate over the cloud services and produce an SLA document comprising service levels, service penalties, service credits, and SLA metrics; the Service Level Objectives (SLOs) cover quality metrics such as throughput, response time, and latency]

[ICDECT17] S. Anithakumari et al. Negotiation and Monitoring of Service Level Agreements in Cloud Computing Services, Springer.
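To make the SLA structure above concrete, here is a minimal sketch of how such a document could be represented programmatically; all field names and threshold values are illustrative assumptions, not taken from the thesis:

```python
# Illustrative sketch only: a minimal SLA document with the elements named above
# (service levels, SLOs, penalties/credits); every field name and value is assumed.
sla_document = {
    "provider": "CSP-1",
    "customer": "CSC-1",
    "service_levels": {"availability_pct": 99.9},
    "slos": {
        # Service Level Objectives over the quality metrics
        "throughput_ops_per_s": {"min": 5000},
        "response_time_ms": {"max": 200},
        "latency_ms": {"max": 50},
    },
    "penalties": {"credit_pct_per_violation": 10},  # service credits granted on breach
}

def violates(slo: dict, measured: float) -> bool:
    """Check one measured metric against its SLO bounds."""
    return (measured < slo.get("min", float("-inf"))
            or measured > slo.get("max", float("inf")))
```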

Context & Motivations

Motivations

[Diagram: customers pay the Cloud Services Provider (CSP) on a pay-as-you-go basis for cloud services, yet get no guarantee that the SLOs are met]

Problem Statement

The quality of the provided services is defined using SLAs. YET there is no standard mechanism to verify and assure that the delivered services satisfy the signed SLA agreement in:
→ an automatic way
→ outside of the Cloud Service Provider's awareness

X measure accurately the Quality of Service (QoS)
X ... without giving the CSP the chance to change the allocated resources

Context & Motivations

Motivations: SLA Violations

On a simple (easy to detect) Key Performance Indicator: downtime

Cloud Vendors     Availability (%)  Downtime (H)  Avg. Downtime (H)  Cost ($/H)  Downtime cost ($)
YouTube           99.999            0.17          0.024              200 k       34 k
Cisco             99.97             5.33          0.761              200 k       1066 k
Facebook          99.951            8.5           1.214              200 k       1700 k
VMware            99.943            10            1.429              336 k       3360 k
Dropbox           99.903            17            2.429              200 k       3400 k
Twitter           99.871            22.68         3.24               200 k       4536 k
Netflix           99.863            24            3.429              200 k       4800 k
Google            99.661            59.31         8.473              300 k       17739 k
Apple             99.583            73.05         10.436             200 k       14610 k
Yahoo             99.475            92            13.143             200 k       18400 k
SalesForce        99.32             119.08        17.012             200 k       23816 k
OVH               98.963            181.63        25.947             336 k       61027 k
IBM               98.727            223           31.857             336 k       74928 k
Amazon            98.382            292.893       41.841             336 k       98411 k
Microsoft Azure   97.811            383.54        54.791             336 k       128869 k

[IWGCCR13] C. Cerin et al. Downtime statistics of current cloud solutions. IWGCCR, 2013.
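The last column follows directly from the two before it: the downtime cost is the measured downtime multiplied by the hourly cost of an outage. Checking against the table's own figures:

$$\text{Downtime cost} = \text{Downtime (H)} \times \text{Cost (\$/H)}, \quad \text{e.g., Cisco: } 5.33\ \text{h} \times 200\,\text{k\$/h} \approx 1066\ \text{k\$}$$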

Context & Motivations

SLA Violations: SaaS

[Screenshot: G Suite Status Dashboard showing SaaS service disruptions]

[Google19] Google. G Suite Status Dashboard. 2019.

Context & Motivations

Research Motivations

State-of-the-Art: SaaS Performance Evaluation

Relatively little research work done:
→ mostly focused on the quality of software services [SP18]
→ quality models: rough sets, MCDA, prediction and fuzzy logic, Mean Opinion Score
→ different attributes of software quality:
X functionality, reliability, usability, efficiency, maintainability, and portability [ISO01]
→ OR reporting easy-to-detect KPIs (e.g., downtime)

⇒ no actual automated way to check QoS

[SP18] D. Jagli et al. A quality model for evaluating SaaS on the cloud computing environment, Springer, 2018.
[ISO01] ISO/IEC 9126-1:2001 Software engineering - Product quality - Part 1: Quality model.

Context & Motivations

Research Motivations

State-of-the-Art: SLA Assurance

There are few works on SLA metrics monitoring/measurement:
→ based on black-box metrics evaluation [CloudCom17]:
X CloudHarmony, Monitis, CloudWatch, CloudStatus, ...
X Test-as-a-Service (TaaS) on the cloud
X other frameworks, e.g., CLOUDQUAL [TI14]

⇒ no automated and standard way of measuring SLA compliance

[CloudCom17] S. Wagle et al. Service performance pattern analysis and prediction of available providers, 2017.
[TI14] X. Zheng et al. CloudQual: A quality model for cloud services, IEEE Trans. on Industrial Informatics, 2014.

Context & Motivations

Research Motivations

Ph.D. Objectives

Propose a systematic & optimized framework for evaluating:
→ the QoS and SLA compliance of the offered cloud SaaS services
→ across several CSPs (allowing to propose a pertinent ranking between them)

The framework should assess SaaS services:
→ with pertinent benchmarking/monitoring involving multiple metrics, using distributed agents
→ in an automatic and stealth (i.e., obfuscated) way
X prevent CSPs from improving their results (by adapting the allocated resources) upon detection of the evaluation
→ by defeating benchmarking detection
X hidden as a "normal" client behaviour

⇒ the PRESEnCE framework

Context & Motivations

Summary of Ph.D. Contributions

[Diagram: the PRESEnCE contributions, built on the analysis of the SLAs and KPIs of SaaS cloud web services across CSPs, organized in three pillars:]
→ Evaluating & Monitoring: stealth testing, metrics modeling, prediction model for metrics, sensitivity analysis
→ Assurance & Verification: SLOs/metrics analysis, probability-based model for detecting breaches
→ QoS Analysis: MCDA-based ranking, service-levels-based ranking

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Summary

1 Context & Motivations

2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud (Modeling Module, Stealth Module, SLA Checker Module)

3 Conclusion and Perspectives

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Proposed Framework

PRESEnCE Framework Objective

Evaluate the QoS and SLA compliance of the offered Web Services
→ across several Cloud Service Providers (CSPs).

Methodology

Quantify in a fair & stealth way the SaaS WS performance
→ including the scalability of the delivered Web Services.

Assess the claimed SLA and the corresponding QoS
→ using a set of relevant performance metrics (e.g., response time).

Provide a multi-objective analysis of the gathered performance metrics
→ to be able to classify cloud brokers.

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE: Framework

[Diagram: on-demand evaluation of SaaS Web Services across multi-cloud providers. A (possibly distributed) PRESEnCE client c' acts as auditor, running agents (one per metric) against Web Service A hosted by Cloud Providers 1..n, alongside the regular clients cA1..cAn and cB1..cBm. Workload/SLA analysis and WS performance evaluation feed an SLA/QoS validator with predictive analytics, analysis, and ranking. Example target services: Redis, Memcached, MongoDB, PostgreSQL, etc.]

The framework relies on three modules:

Modeling Module (monitoring/modeling)
→ monitoring & modeling the Cloud services performance metrics

Stealth Module (dynamic load adaptation)
→ providing obfuscated and optimized benchmarking scenarios

SLA Checker Module (virtual QoS aggregator)
→ assessing & assuring SLA metrics


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE: Modeling Module

PRESEnCE modeling module objectives
→ analysis of SaaS metrics
→ evaluating & monitoring SaaS Web Services
→ collecting data for the metrics
→ modeling the performance metrics

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Cloud Critical KPIs

KPIs                       Metrics
Availability               Response time, Up time, Down time
Scalability                Avg. assigned resources, Avg. number of users, Capacity
Reliability                Accuracy of Service, Fault Tolerance, Maturity
Efficiency                 Utilization of Resources, Ratio of waiting time
Reusability                Readability, Publicity, Coverage of variability
Composability              Service Modularity, Service Interoperability
Adaptability               Completeness of Variant Set, Coverage of Variability
Usability                  Operability, Attractiveness, Learnability
Elasticity                 Suspend Time, Delete Time, Provision Time
Network and Communication  Packet Loss Frequency, Connection Error Rate, Throughput, Latency
Security                   Security Standards, Data Integrity, Sensitivity, Confidentiality
Cost                       Total Cost, FLOP Cost (cent/FLOP, GFLOP)

[CC12] S. Sinung et al. Performance measurement of cloud computing services, 2012.
[TR10] Guiding Metrics, The cloud service industry's 10 most critical metrics, 2019.

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE: Agents

The agents (one per metric) feed the stealth, modeling, and SLA checker modules:

Benchmark Tool    Version    Targeted SaaS Web Services
YCSB              0.12.0     Redis, MongoDB, Memcached, DynamoDB, etc.
Memtier-Bench     1.2.8      Redis, Memcached
Redis-Bench       2.4.2      Redis
Twitter RPC-Perf  2.0.3-pre  Redis, Memcached, Apache
PgBench           9.4.12     PostgreSQL, MySQL, SQL Server, Oracle DB
Apache AB         2.3        Apache, Nginx, Jexus
HTTP Load         1          Apache, Nginx, Jexus
Iperf             v1, v3     Iperf server

[MIS18] A. Ibrahim et al. Monitoring and modelling the performance metrics of mobile cloud SaaS web services. Mobile Information Systems, 2018.
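As an illustration of how an agent can drive one of these tools, the sketch below wraps Apache AB and extracts a throughput figure. Only the ab flags (-n total requests, -c concurrency) and its "Requests per second" output line are standard; the wrapper itself (function name, endpoint, parsing strategy) is an assumption for demonstration:

```python
# Hedged sketch of a PRESEnCE-style agent wrapping the Apache AB benchmark.
import re
import subprocess

def run_apache_ab(url: str, requests: int = 1000, concurrency: int = 50) -> float:
    """Run Apache AB against a web service endpoint and return its throughput (req/s)."""
    out = subprocess.run(
        ["ab", "-n", str(requests), "-c", str(concurrency), url],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"Requests per second:\s+([\d.]+)", out)
    if match is None:
        raise RuntimeError("could not parse ab output")
    return float(match.group(1))

# Example against a hypothetical endpoint:
# throughput = run_apache_ab("http://example-saas.test/", requests=5000, concurrency=100)
```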

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Identified Performance Metrics

Performance metrics coverage per benchmark Bi:
→ Bi INPUTs: #Transactions, #Requests, #Operations, #Records, #Fetches, #Parallel Clients, #Pipes, #Threads, Workload Size
→ Bi OUTPUTs (measured metrics): Throughput, Latency, Read Latency, Update Latency, CleanUp Latency, Transfer Rate, Response Time, Miss Hits

[Table: coverage matrix marking, for each benchmark (YCSB, Memtier-Bench, Redis-Bench, Twitter RPC-Perf, PgBench, Apache AB, HTTP Load, Iperf), which input parameters it accepts and which output metrics it measures]

[MIS18] A. Ibrahim et al. Monitoring and modelling the performance metrics of mobile cloud SaaS web services. Mobile Information Systems, 2018.

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Deployed Web Services

SaaS Service  Type            Version    Used by
Redis         NoSQL Database  2.8.17     GitHub, Twitter, Pinterest
MongoDB       NoSQL Database  3.4        Google, Facebook, Cisco, eBay, Uber
Memcached     NoSQL Database  1.5.0      Amazon, Netflix, Instagram, Slack, Dropbox
PostgreSQL    SQL Database    9.4        Nokia, BMW, Netflix, Skype, Apple
Apache        HTTP            2.2.22.13  LinkedIn, Slack, Accenture
Iperf server  Network         v1, v3     –

[MIS18] A. Ibrahim et al. Monitoring and modelling the performance metrics of mobile cloud SaaS web services. Mobile Information Systems, 2018.

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Monitoring

[Diagram: the PRESEnCE auditor replays a normal trace (workload) as benchmarking scenarios through the PRESEnCE agents against the services deployed at CSP 1..n, monitoring the metrics and producing evaluations alongside the regular traffic of CSC 1..n]

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Monitoring Results

[Figure: throughput (ops/sec) vs. number of operations and number of records, for Redis, MongoDB, and Memcached]

[Figure: update latency and read latency (µs) vs. number of operations and number of records, for Redis, MongoDB, and Memcached]

[Figure: HTTP Load results: normalized throughput (fetches/sec) vs. number of fetches; normalized latency vs. number of parallel clients (300, 650, 1000); average latency (msec) vs. number of parallel clients (200-1000); average connection latency (msec) vs. number of fetches (30000, 120000, 190000)]

[Figure: PgBench results: normalized TPS vs. number of transactions per client; normalized response time (latency) vs. number of parallel clients (20, 50, 80)]

[MIS18] A. Ibrahim et al. Monitoring and modelling the performance metrics of mobile cloud SaaS web services. Mobile Information Systems, 2018.

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Performance Modeling

[Diagram: the monitoring workflow is extended with data collection and a modeling step: the data collected from the benchmarking scenarios is fed to the Arena Input Analyser, which fits the appropriate statistical distribution to each performance metric]

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Models Validation

Necessity to validate the PRESEnCE models generated from:
→ the monitoring data from the PRESEnCE agents
→ the generated data from the obtained models

[Decision tree: a normality test (Kolmogorov-Smirnov) is applied to the data sets. Normal variables (mean comparisons, parametric tests): Student t-test for 2 data sets, ANOVA (Analysis of Variance) for more than 2. Non-normal variables (median comparisons, non-parametric tests): Wilcoxon test for 2 data sets, Friedman test for more than 2]

Statistically significant: confidence level > 95% (p-value < 0.05)

[ITOR13] E. Alba et al. Parallel metaheuristics: recent advances and new trends. ITOR, 2013.
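A minimal sketch of the two-data-set branch of this validation flow, using standard SciPy tests; the arrays here are synthetic stand-ins for the monitored and model-generated data:

```python
# Sketch: KS normality test, then Student t-test (parametric) or
# Wilcoxon rank-sum test (non-parametric) for two data sets.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
monitored = rng.gamma(shape=2.39, scale=0.0846, size=500)   # agent measurements (synthetic)
model_data = rng.gamma(shape=2.39, scale=0.0846, size=500)  # data generated from the fitted model

def is_normal(x: np.ndarray, alpha: float = 0.05) -> bool:
    """Kolmogorov-Smirnov test of the standardized sample against N(0, 1)."""
    z = (x - x.mean()) / x.std(ddof=1)
    return stats.kstest(z, "norm").pvalue > alpha

if is_normal(monitored) and is_normal(model_data):
    pvalue = stats.ttest_ind(monitored, model_data).pvalue  # mean comparison
else:
    pvalue = stats.ranksums(monitored, model_data).pvalue   # median comparison (Wilcoxon rank-sum)

# p-value >= 0.05: no significant difference, i.e., the model is not rejected
print(f"p-value = {pvalue:.3f} ->", "model validated" if pvalue >= 0.05 else "model rejected")
```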

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Modeling Results

Ex: Redis SaaS Web service

Throughput: Beta distribution
Model: $-0.001 + 1 \cdot \mathrm{BETA}(3.63, 3.09)$, with $\beta = 3.63$, $\alpha = 3.09$, offset $= -0.001$
$$f(x) = \begin{cases} \frac{x^{\beta-1}(1-x)^{\alpha-1}}{B(\beta,\alpha)} & \text{for } 0 < x < 1 \\ 0 & \text{otherwise} \end{cases}$$
where $B$ is the complete beta function, $B(\beta,\alpha) = \int_0^1 t^{\beta-1}(1-t)^{\alpha-1}\,dt$

Read Latency: Gamma distribution
Model: $-0.001 + \mathrm{GAMM}(0.0846, 2.39)$, with $\beta = 0.0846$, $\alpha = 2.39$, offset $= -0.001$
$$f(x) = \begin{cases} \frac{\beta^{-\alpha}\, x^{\alpha-1}\, e^{-x/\beta}}{\Gamma(\alpha)} & \text{for } x > 0 \\ 0 & \text{otherwise} \end{cases}$$
where $\Gamma$ is the complete gamma function, $\Gamma(\alpha) = \int_0^{\infty} t^{\alpha-1} e^{-t}\,dt$

Update Latency: Erlang distribution
Model: $-0.001 + \mathrm{ERLA}(0.0733, 3)$, with $k = 3$, $\beta = 0.0733$, offset $= -0.001$
$$f(x) = \begin{cases} \frac{\beta^{-k}\, x^{k-1}\, e^{-x/\beta}}{(k-1)!} & \text{for } x > 0 \\ 0 & \text{otherwise} \end{cases}$$

[CLOUD18] A. Ibrahim et al. Performance metrics models for cloud SaaS web services, IEEE CLOUD 2018.

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Modeling Results

Ex: MongoDB SaaS Web service

Throughput: Beta distribution, $-0.001 + 1 \cdot \mathrm{BETA}(3.65, 2.11)$, with $\beta = 3.65$, $\alpha = 2.11$, offset $= -0.001$ (same Beta pdf as above)

Read Latency: Beta distribution, $-0.001 + 1 \cdot \mathrm{BETA}(1.6, 2.48)$, with $\beta = 1.6$, $\alpha = 2.48$, offset $= -0.001$ (same Beta pdf as above)

Update Latency: Erlang distribution, $-0.001 + \mathrm{ERLA}(0.0902, 2)$, with $k = 2$, $\beta = 0.0902$, offset $= -0.001$ (same Erlang pdf as above)

[CLOUD18] A. Ibrahim et al. Performance metrics models for cloud SaaS web services, IEEE CLOUD 2018.
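These fitted models can be replayed directly, for instance to draw synthetic metric values from the Redis models above with NumPy. This is a sketch under the assumption that Arena's BETA(β, α) maps to NumPy's beta(a=β, b=α), GAMM(β, α) uses β as scale and α as shape, and ERLA(β, k) is a Gamma with integer shape k and scale β:

```python
# Sampling synthetic Redis metrics from the fitted distributions above.
import numpy as np

rng = np.random.default_rng(42)
offset = -0.001

throughput     = offset + rng.beta(3.63, 3.09, size=1000)                 # -0.001 + 1*BETA(3.63, 3.09)
read_latency   = offset + rng.gamma(shape=2.39, scale=0.0846, size=1000)  # -0.001 + GAMM(0.0846, 2.39)
update_latency = offset + rng.gamma(shape=3, scale=0.0733, size=1000)     # -0.001 + ERLA(0.0733, 3)
```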

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE WS Modeling Results

SaaS WS Performance Models Summary

19 models were generated:
→ representing the measured performance metrics (throughput, latency, read/update/cleanup latency, transfer rate, response time, miss hits) of the SaaS Web Services

15 out of 19 models proved accurate:
→ i.e., 78.9% of the analyzed models have a confidence level > 95%

[CLOUD18] A. Ibrahim et al. Performance metrics models for cloud SaaS web services, IEEE CLOUD 2018.

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Modeling Module Summary

[Diagram: within the PRESEnCE framework, the modeling module proceeds through: analysis of the performance metrics → evaluating & monitoring the metrics → collecting data for the metrics → generating distribution models]

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE: Stealth Module

PRESEnCE stealth module objectives
Provide benchmark scenarios which ensure:
→ accurate and stealth (i.e., obfuscated) testing
X the CSP should not adapt the allocated resources, e.g., to improve the evaluation results
→ defeating potential benchmarking detection
X hidden as a "normal" client behaviour
→ exploiting the PRESEnCE models previously generated

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Stealth Module Overview

[Diagram: the stealth module builds testing models (1..n) from the collected data and generates a testing scenario trace; an ORACLE compares this trace against the expected normal trace. If the testing trace is distinguishable from normal usage, it is not stealth and the test cannot run; if it is indistinguishable, it is stealth and the test can run against the CSPs under evaluation]

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

The Stealth Problem

Given an [estimated] aggregated SaaS customer behaviour→ Find the best benchmarking scenario matching this behaviour

X time-sequence of carefully selected benchmarksX adaptation/optimisation of input parameters

36 / 65Abdallah Ibrahim (PhD Defense) Performance Evaluation and Modeling of SaaS Web Services in the Cloud

N

PRESEnCE

Stealth Module

dynamic load adaptation

Modeling Module

monitoring/modeling

SLA checker Module

virtual QoS aggregator

Stealth Module

dynamic load adaptation

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

The Stealth Problem

Given an [estimated] aggregated SaaS customer behaviour→ Find the best benchmarking scenario matching this behaviour

X time-sequence of carefully selected benchmarksX adaptation/optimisation of input parameters

Ex: A possible solution (benchmarking scenario)→ based on benchmarks models Bi , for time period [T0 = 5, Tend = 340]

Time starttstart

Time endtend

Benchmark

Bi or Bi

Inputsparameters

5 120 Bench1 X1

45 220 Bench2 X2

130 280 Bench3 X3

190 340 Bench4 X4


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Illustration

[Figure: Throughput vs. starting/termination time — the testing trace Bench1{X1}, Bench2{X2}, Bench3{X3}, Bench4{X4} overlaid on the normal trace.]


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Illustration

For a given estimated benchmark Bi
→ find the optimized input parameters Xi* minimizing the RSS distance
X Obj: defeat the ORACLE detection scheme

Time start (tstart)   Time end (tend)   Benchmark Bi   Input parameters
5                     120               Bench1         X1 → X1*
45                    220               Bench2         X2 → X2*
130                   280               Bench3         X3 → X3*
190                   340               Bench4         X4 → X4*
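For concreteness, a minimal sketch of the RSS distance this optimization works on, assuming both traces are sampled on the same time grid:

```python
import numpy as np

def rss_distance(normal_trace, testing_trace):
    """Residual sum of squares between the expected normal trace and the
    trace generated by a benchmarking scenario; both are assumed to be
    sampled on the same time grid."""
    y = np.asarray(normal_trace, dtype=float)
    y_hat = np.asarray(testing_trace, dtype=float)
    return float(np.sum((y - y_hat) ** 2))
```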


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Illustration

[Figure: same throughput-vs-time view — the distance between the testing trace (Bench1{X1}..Bench4{X4}) and the normal trace is being minimized.]


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Illustration

[Figure: after distance minimization — the optimized testing trace (Bench1{X1*}..Bench4{X4*}) closely follows the normal trace.]


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Summary

Multi-layer optimisation

Optimizing the benchmarking scenario
→ finding appropriate parameters for each benchmark Bi
X minimizing the RSS distance
X i.e., the underlying detection heuristic of the Oracle
→ for each ∆t:
X find the best estimated benchmark Bi
→ for the global time period:
X derive an optimized benchmarking scenario
X an optimized sequence of benchmarks, incl. input parameters, start & end times

[Figure: Oracle, emulating the CSP view — calculate the RSS distance between the expected normal usage model and the PRESEnCE benchmark traces; if RSS < threshold (YES), PRESEnCE is non-distinguishable; otherwise (NO), keep optimizing the distance.]
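A minimal sketch of this detection loop, reusing rss_distance from the sketch above (the threshold value is an assumption here, not the thesis's calibrated Oracle):

```python
# Assumed calibration -- the actual threshold is derived empirically
# from the expected normal-usage model.
RSS_THRESHOLD = 25000.0

def oracle_is_stealth(normal_trace, testing_trace, threshold=RSS_THRESHOLD):
    """Emulate the CSP view: a scenario is stealth only if the RSS distance
    between its trace and the normal trace stays below the threshold."""
    return rss_distance(normal_trace, testing_trace) < threshold
```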


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Problem Resolution

Solving approaches:
→ Exact Methods: best fit, but suffer from a scalability issue
→ Metaheuristics: approximate fit, scalable

[Wiley09] EG Talbi, Meta-heuristics: from design to implementation


Proposed approach
→ Genetic Algorithm (GA)
→ Hybrid Algorithm (GA + ML)
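To make the GA side concrete, a self-contained toy sketch (hand-rolled, not the thesis code) using the operator choices reported in the experimental setup below — bi-tournament selection, 2-point crossover, uniform mutation:

```python
import random

def evolve(fitness, new_individual, pop_size=20, generations=1000,
           cx_rate=0.8, mut_rate=0.01):
    """Minimal GA minimizing `fitness` (e.g. the RSS residual of a candidate
    benchmarking scenario encoded as a vector in [0, 1]^d)."""
    pop = [new_individual() for _ in range(pop_size)]

    def tournament():
        a, b = random.sample(pop, 2)            # bi-tournament selection
        return list(a if fitness(a) < fitness(b) else b)

    for _ in range(generations):
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = tournament(), tournament()
            if random.random() < cx_rate:       # 2-point crossover
                i, j = sorted(random.sample(range(len(p1) + 1), 2))
                p1[i:j], p2[i:j] = p2[i:j], p1[i:j]
            for child in (p1, p2):
                for k in range(len(child)):
                    if random.random() < mut_rate:
                        child[k] = random.uniform(0.0, 1.0)  # uniform mutation
                offspring.append(child)
        pop = offspring[:pop_size]
    return min(pop, key=fitness)

# Placeholder fitness: distance of the parameter vector to a toy target.
target = [0.2, 0.5, 0.8, 0.4]
best = evolve(lambda ind: sum((a - b) ** 2 for a, b in zip(ind, target)),
              new_individual=lambda: [random.random() for _ in range(4)],
              generations=200)
```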

PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Optimisation Model

Stealth problem objective

For each time period and each Bi
→ optimize the set of inputs Xi
→ Obj: defeat the oracle detection

Optimize the benchmark set over time {Bi}t
→ Note: benchmarks may overlap
→ yet, without loss of generality:
X no overlap between benchmarks is assumed here

$$\min_{\theta \in \Theta} \; \max_{i} \; \Delta(Y, \hat{Y}) \quad \text{where} \quad \Delta(Y, \hat{Y}) = \sum_{i,k}^{n} \left| Y_i - \hat{Y}_k \right| \qquad (1)$$

$$\text{s.t.} \quad \min_{\theta \in \Theta} z, \qquad
\begin{cases}
z \geq \sum \left| y_1 - \hat{y}_1(\Theta) \right| \\
z \geq \sum \left| y_2 - \hat{y}_2(\Theta) \right| \\
z \geq \sum \left| y_3 - \hat{y}_3(\Theta) \right| \\
\quad \vdots \\
z \geq \sum \left| y_i - \hat{y}_i(\Theta) \right|
\end{cases} \qquad (2)$$

$$i \in \{1, 2, 3, \ldots, n\} \qquad (3)$$

where $\hat{y}_i(\Theta)$ → Prediction Model


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Application: FIFA Web Services

Deployed during one of the most popular worldwide events
→ squad and venue information, live matches, etc.

[NET] A. Martin et al. Workload Characterization of the 1998 World Cup Web Site. IEEE Network.


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Experimental Setup

Application of the PRESEnCE stealth module against the FIFA WS traces
→ comparison of the two proposed approaches (GA and Hybrid)

                        Configurations [1, 2, 3]   Configurations [4, 5, 6]
Expected normal trace   FIFA                       FIFA
Number of generations   1000                       10000
Population size         [20, 50, 100]              [20, 50, 100]
Number of evaluations   [50, 20, 10]               [500, 200, 100]
Selection process       Bi-Tournament              Bi-Tournament
Crossover operator      2-point crossover          2-point crossover
Crossover rate          0.8                        0.8
Mutation operator       uniform                    uniform
Mutation rate           0.01                       0.01
Number of executions    30                         30


Performance indicator for the PRESEnCE stealth module ⇒ Convergence


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Results - Convergence

[Figure: Convergence (residual) of GA vs. Hybrid for Configurations 1–6 (Ex = 30 executions each; Ev = 10, 20, 50, 100, 200, 500 evaluations), with STD, StdErr, and 95% confidence intervals.]


Lower convergence (residual) is better.

[Same convergence figure, annotated with the ORACLE threshold.]


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

GA vs. Hybrid

Necessity to validate the produced benchmarking scenarios
→ stealth results from the PRESEnCE GA and Hybrid approaches

Normality test (Kolmogorov-Smirnov test) on the data sets:
→ True → normal variables (mean comparisons, parametric tests):
X 2 data sets: Student t-test
X > 2 data sets: ANOVA (Analysis of Variance)
→ False → non-normal variables (median comparisons, non-parametric tests):
X 2 data sets: Wilcoxon test
X > 2 data sets: Friedman test

[ITOR13] E. Alba et al. Parallel meta-heuristics: recent advances and new trends. ITOR 2013.
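A sketch of this decision procedure with SciPy, for the two-data-set branch (the significance level used for the normality check is an assumption):

```python
from scipy import stats

def compare_two(sample_a, sample_b, alpha=0.01):
    """Pick the test as in the decision tree above: KS normality check,
    then Student t-test (parametric) or Wilcoxon (non-parametric)."""
    normal = all(stats.kstest(stats.zscore(s), "norm").pvalue > alpha
                 for s in (sample_a, sample_b))
    if normal:
        return stats.ttest_ind(sample_a, sample_b)
    return stats.wilcoxon(sample_a, sample_b)  # paired samples assumed
```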


Statistically significant: confidence level > 99% (p-value < 0.01)


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Compare: GA & Hybrid Approach

                   GA                     Hybrid
Configuration      Mean       Std         Mean       Std         p-value
Configuration 1    29043.1    276.6091    25619.51   272.6941    4.114e-05
Configuration 2    29306.89   167.5696    26643.73   332.9266    1.455e-07
Configuration 3    30021.77   255.2807    25818.35   293.3159    2.2e-16
Configuration 4    28675.97   94.29658    20898.48   83.75769    2.2e-16
Configuration 5    29044.13   146.8183    25619.51   272.6941    2.2e-16
Configuration 6    29095.63   314.8841    26733.43   184.4088    2.2e-16


[Figure: Boxplots of convergence (residual) for GA vs. R+GA (Hybrid) across Configurations 1–6.]


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE Stealth Module Summary

[Figure: Stealth module summary — within the PRESEnCE wheel (workload/SLA analysis, performance evaluation, SLA/QoS validator, predictive analytics, ranking), the stealth module performs black-box optimization of the testing trace; the ORACLE compares it with the expected normal trace: NO → generate a new scenario, YES → proceed with stealth testing.]


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Summary

1 Context & Motivations

2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud
   Modeling Module / Stealth Module / SLA Checker Module

3 Conclusion and Perspectives


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

PRESEnCE: SLA checker Module

[Figure: PRESEnCE wheel with the SLA checker module (virtual QoS aggregator) highlighted.]

PRESEnCE SLA checker module objectives
→ analyze the SLOs and QoS metrics
→ SLA verification & assurance
→ service-levels-based ranking of the CSPs


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

Service Level Objectives (SLO)

[UCC14] S. Wagle et al., SLA assured brokering (SAB) and CSP certification in cloud computing, UCC, 2014
[CCGrid16] A. Ibrahim et al., SLA Assurance between Cloud Services Providers and Cloud Customers, IEEE CCGrid, 2016


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Assurance

[Figure: SLA assurance — the PRESEnCE Auditor stealth-tests the services deployed by CSPs 1..n (serving CSCs 1..n) using the testing models (1..n) and the ORACLE, monitors and models the metric evaluations, and checks the real quality metrics against the SLAs for QoS assurance.]

[CLOUD16] A. Ibrahim et al., On SLA Assurance in Cloud Computing Data Centers, IEEE CLOUD, 2016
[CCGrid16] A. Ibrahim et al., SLA Assurance between Cloud Services Providers and Cloud Customers, IEEE CCGrid, 2016


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Breaches

Probability-based model for detecting SLA breaches

PRESEnCE monitoring of the SLA metrics
→ identifying the possibility of breaches:
X Ex: read latency > average read latency
X Ex: throughput < average throughput

PRESEnCE modeling of the SLA metrics
→ the models are used to find the probability P(x) of a breach

If the probability of a breach P(x) > threshold:
→ assumed SLA violation
X legitimate request for penalization/compensation from the CSP
X PRESEnCE allows for an argued complaint
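A toy sketch of this check (the decision threshold, the SLA limit, and the fitted model below are placeholders; the thesis fits per-metric distribution models such as those listed in the appendix):

```python
from scipy import stats

# Placeholder: standardized read-latency model fitted as in the appendix
# (Beta with beta = 1.64, alpha = 3.12, offset = -0.001).
latency_model = stats.beta(a=1.64, b=3.12, loc=-0.001, scale=1)

def breach_probability(model, sla_limit):
    """P(x): probability, under the fitted metric model, that the metric
    exceeds the agreed SLA limit (here: latency too high)."""
    return model.sf(sla_limit)

def is_violation(model, sla_limit, threshold=0.3):
    # assumed decision threshold; above it, PRESEnCE assumes an SLA
    # violation and raises an argued complaint for penalization
    return breach_probability(model, sla_limit) > threshold
```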


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Breaches

[Figure: Redis standardized latency and MongoDB standardized throughput over the number of operations (0–10000).]

[HONET-ICT17] A. Ibrahim et al., Law-as-a-service (LaaS): Enabling legal protection over a blockchain network, IEEE HONET-ICT 2017


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Breaches

[Figure: Memcached standardized read latency over the number of operations (0–10000).]


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Breaches

[Figure: Memcached standardized read latency over the number of operations, with the breach-of-contract region highlighted.]


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Breaches

[Figure: Redis standardized latency and MongoDB standardized throughput over the number of operations, with the breach-of-contract regions highlighted.]


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Penalization

[Figure: Redis breach in latency and MongoDB breach in throughput — probability-of-breach and penalization points marked over the number of operations.]

[HONET-ICT17] A. Ibrahim et al., Law-as-a-service (LaaS): Enabling legal protection over a blockchain network, IEEE HONET-ICT 2017


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

SLA Penalization

[Figure: Memcached breach in latency — standardized latency with probability-of-breach and penalization markers over the number of operations.]


Probability-based model for detecting SLA breaches

Penalization was issued for:
→ Redis web services
→ MongoDB web services

No penalization was issued for:
→ Memcached web services


PRESEnCE: PeRformance Evaluation of SErvices on the Cloud

CSPs Ranking

PRESEnCE: service-levels-based ranking

Use QoS criteria to rank the Cloud Service Providers (CSPs).
Based on the assured SLAs vs. the PRESEnCE-assessed SLAs:
→ comparative analysis between CSPs
→ enabled by the PRESEnCE SaaS performance evaluation

[Figure: Test–Assure–Rank pipeline — the PRESEnCE Auditor monitors and models the services deployed by CSPs 1..n under evaluation, stealth-tests them against the normal trace (workload) via the ORACLE, assures the QoS against the SLAs, and produces a QoS-based ranking of the CSPs.]


MCDA objective

AHP: pair-wise comparison of elements structured in a hierarchical relationship
TOPSIS: criteria-based selection of the alternative that is closest to the ideal solution

[CloudCom17] A. Ibrahim et al., Self-regulated MCDA: An autonomous brokerage-based approach for service provider ranking in the cloud.
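For illustration, a compact TOPSIS sketch (the weights and the decision matrix are placeholders, not the thesis's QoS data):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) over criteria (columns).
    benefit[j] is True if higher is better for criterion j."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)          # vector normalization
    v = norm * np.asarray(weights, dtype=float)   # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)           # 1 = closest to ideal
    return np.argsort(-closeness), closeness

# Placeholder CSP decision matrix: [throughput, latency, availability]
ranks, scores = topsis([[5800, 0.31, 99.9], [5200, 0.21, 99.5]],
                       weights=[0.4, 0.4, 0.2],
                       benefit=[True, False, True])
```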


Conclusion and Perspectives

Summary

1 Context & Motivations

2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud
   Modeling Module / Stealth Module / SLA Checker Module

3 Conclusion and Perspectives


Conclusion and Perspectives

Conclusion

Summary: Cloud Computing (CC), SaaS and SLAs

CC: a successful and easy-to-use distributed computing paradigm
→ typical deployment models: SaaS, PaaS, and IaaS
→ SaaS is the most used model in the cloud market

Cloud services are offered through Service Level Agreements
→ SLAs define both service levels and service penalties


YET: no standard mechanism exists to verify and assure that the delivered services satisfy the signed SLA agreement
→ in an automatic way
→ outside of the Cloud Service Provider's awareness
X i.e., measure the Quality of Service (QoS) accurately
X without giving the CSP the chance to change the allocated resources


Conclusion and Perspectives

In this Ph.D Thesis

Extensive study of SaaS performance and SLA compliance.
Proposed the PRESEnCE framework:
→ for monitoring, modeling and evaluating the performance of cloud SaaS web services

All frameworks were implemented and validated on real case scenarios.


Summary of Contributions

Multi-criteria evaluation of SaaS Web Services
→ distributed PRESEnCE agents tied to KPIs
→ accurate models for the Web Services metrics

Optimization model obfuscating the CSP evaluation campaigns
→ using meta-heuristics and hybrid GA/machine learning

Probability-based model for SLA breach detection.
Service-levels-based ranking of the CSPs.


Conclusion and Perspectives

Conclusion

[Figure: Overall PRESEnCE architecture — a [distributed] PRESEnCE client c' (Auditor) evaluates Web Service A across Cloud Providers 1..n alongside the regular clients (cA1..cAn, cB1..cBm), with one agent per metric (agent/metric 1..k) feeding the stealth module (dynamic load adaptation), the modeling module (predictive monitoring) and the SLA checker module (virtual QoS aggregator). Example services: Redis, Memcached, MongoDB, PostgreSQL, etc.]


[Figure: PRESEnCE Test–Assure–Rank loop — monitoring/modeling of the deployed services under evaluation, stealth testing trace (workload) checked by the ORACLE, QoS assurance against the SLAs, and QoS-based ranking of CSPs 1..n.]


Conclusion and Perspectives

Perspectives

PRESEnCE Framework

Extend testing to other cloud deployment models
→ PaaS, IaaS

Evaluate and monitor other KPIs (not only performance):
→ security KPIs
→ cost KPIs


SLAs

Smart SLA
→ smart detection of SLA breaches or violations
→ instant penalization & compensation


⇒ Launching PRESEnCE as a commercial product


Conclusion and Perspectives

List of Publications I

Publication category   Quantity
Journal                1
Intl. Conferences      7
Book Chapters          1
Intl. Workshops        3
Technical Reports      1

Journal (1)

1. A. A.Z.A. Ibrahim, M. U. Wasim, S. Varrette, and P. Bouvry. Presence: Monitoring and modelling the performance metrics of mobile cloud SaaS web services. Mobile Information Systems, 2018.

International Conferences (7)

2. A. A.Z.A. Ibrahim, D. Kliazovich, and P. Bouvry. Service Level Agreement Assurance between Cloud Services Providers and Cloud Customers. 16th IEEE/ACM Intl. Symp. on Cluster, Cloud, and Grid Computing (CCGrid 2016), pages 588-591, 2016.

3. A. A.Z.A. Ibrahim, D. Kliazovich, and P. Bouvry. On Service Level Agreement Assurance in Cloud Computing Data Centers. 9th IEEE Intl. Conf. on Cloud Computing (Cloud 2016), pages 921-926, 2016.


Conclusion and Perspectives

List of Publications II

4. A. A.Z.A. Ibrahim, D. Kliazovich, P. Bouvry, and A. Oleksiak. Using virtual desktop infrastructure to improve power efficiency in Grinfy system. 8th IEEE Intl. Conf. on Cloud Computing Technology and Science (CloudCom'16), pages 85-89, 2016.

5. M. U. Wasim, A. A.Z.A. Ibrahim, P. Bouvry, and T. Limba. Law as a Service (LaaS): Enabling legal protection over a blockchain network. 14th Intl. Conf. on Smart Cities: Improving Quality of Life Using ICT & IoT (HONET-ICT'17), pages 110-114, 2017.

6. M. U. Wasim, A. A.Z.A. Ibrahim, P. Bouvry, and T. Limba. Self-regulated multi-criteria decision analysis: An autonomous brokerage-based approach for service provider ranking in the cloud. 9th IEEE Intl. Conf. on Cloud Computing Technology and Science (CloudCom'17), pages 33-40, 2017.

7. A. A.Z.A. Ibrahim, S. Varrette, and P. Bouvry. PRESEnCE: Toward a Novel Approach for Performance Evaluation of Mobile Cloud SaaS Web Services. 32nd IEEE Intl. Conf. on Information Networking (ICOIN 2018), 2018. Best Paper Award

8. A. A.Z.A. Ibrahim, M. U. Wasim, S. Varrette, and P. Bouvry. Performance metrics models for cloud SaaS web services. 11th IEEE Intl. Conf. on Cloud Computing (CLOUD'18), pages 936-940, 2018.

Book Chapters (1)

9. P. Bouvry, S. Varrette, M.U. Wasim, A. A.Z.A. Ibrahim, X. Besseron, and TA Trinh. Security, reliability and regulation compliance in ultrascale computing system. Chapter 3, Ultrascale Computing Systems, pages 65-83, 2018.


Conclusion and Perspectives

List of Publications III

Workshops (3)

10. A. A.Z.A. Ibrahim. PRESEnCE: A framework for monitoring, modelling and evaluating the performance of cloud SaaS web services. 48th Annual IEEE/IFIP Intl. Conf. on Dependable Systems and Networks Workshops (DSN-W'18), pages 83-86, 2018.

11. A. A.Z.A. Ibrahim, S. Varrette, and P. Bouvry. On verifying and assuring the cloud SLA by evaluating the performance of SaaS web services across multi-cloud providers. DSN-W'18, pages 69-70, 2018.

12. A. A.Z.A. Ibrahim, D. Kliazovich, P. Bouvry, and A. Oleksiak. Virtual desktop infrastructures: architecture, survey and green aspects proof of concept. 7th Intl. Green and Sustainable Computing Conf. (IGSC'16), pages 1-8, 2016.

Technical Reports (1)

13. A. A.Z.A. Ibrahim. Best Practices for Cloud Migration and Service Level Agreement Compliances. Dell EMC (Internally), 2019


Thank you for your attention

Abdallah Ali Zainelabden Abdallah Ibrahim
University of Luxembourg, Belval Campus
Maison du Nombre, 4th floor
2, avenue de l'Université
L-4365 Esch-sur-Alzette
mail: [email protected]


Backup Slides / Appendix


Backup Slides / Appendix

PRESEnCE Monitoring: Memtier

[Figure: Memtier-Bench monitoring of Redis and Memcached over the number of requests — three panels: Throughput (ops/sec), Update Latency (s), and Transfer Rate (Kb/sec); server-restart points are marked.]

Backup Slides / Appendix

PRESEnCE Models: ycsb (Memcached server)

Metric: Throughput — Beta distribution
Model: −0.001 + 1 · BETA(4.41, 2.48), with β = 4.41, α = 2.48, offset = −0.001

$$f(x) = \begin{cases} \dfrac{x^{\beta-1}(1-x)^{\alpha-1}}{B(\beta,\alpha)} & \text{for } 0 < x < 1 \\ 0 & \text{otherwise} \end{cases}$$

where B is the complete beta function, given by $B(\beta, \alpha) = \int_0^1 t^{\beta-1}(1-t)^{\alpha-1}\,dt$.

Metric: Latency (Read) — Beta distribution
Model: −0.001 + 1 · BETA(1.64, 3.12), with β = 1.64, α = 3.12, offset = −0.001; same density f(x) as above.

Metric: Latency (Update) — Normal distribution
Model: NORM(0.311, 0.161), with mean µ = 0.311 and standard deviation σ = 0.161

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}} \, e^{-\frac{(x-\mu)^2}{2\sigma^2}} \quad \text{for all real } x$$
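These Arena-style expressions map directly onto SciPy's shifted distributions; a sketch (the parameter mapping is my reading of BETA(β, α) against the density above, where β plays the role of SciPy's `a`):

```python
from scipy import stats

# BETA(beta, alpha) with f(x) = x**(beta-1) * (1-x)**(alpha-1) / B(beta, alpha)
# corresponds to scipy's beta(a=beta, b=alpha); loc applies the offset.
throughput = stats.beta(a=4.41, b=2.48, loc=-0.001, scale=1)
read_latency = stats.beta(a=1.64, b=3.12, loc=-0.001, scale=1)
update_latency = stats.norm(loc=0.311, scale=0.161)

samples = throughput.rvs(size=1000)  # synthetic standardized throughput
```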

Backup Slides / Appendix

PRESEnCE Models: pgBench (PostgreSQL server)

Metric: Throughput — (no model)

Metric: Latency — Log-Normal distribution
Model: −0.001 + LOGN(0.212, 0.202), with log-mean µ = 0.212, log-std σ = 0.202, offset = −0.001

$$f(x) = \begin{cases} \dfrac{1}{\sigma x \sqrt{2\pi}} \, e^{-\frac{(\ln(x)-\mu)^2}{2\sigma^2}} & \text{for } x > 0 \\ 0 & \text{otherwise} \end{cases}$$

Backup Slides / Appendix

PRESEnCE Models: HTTP load (Apache server)

Metric: Throughput — (no model)

Metric: Latency — Beta distribution
Model: −0.001 + 1 · BETA(1.55, 3.46), with β = 1.55, α = 3.46, offset = −0.001

$$f(x) = \begin{cases} \dfrac{x^{\beta-1}(1-x)^{\alpha-1}}{B(\beta,\alpha)} & \text{for } 0 < x < 1 \\ 0 & \text{otherwise} \end{cases}$$

where B is the complete beta function, given by $B(\beta, \alpha) = \int_0^1 t^{\beta-1}(1-t)^{\alpha-1}\,dt$.

Backup Slides / Appendix

PRESEnCE Sensitivity Analysis

[Figure: Sensitivity analysis — main effects and interactions of No_oper, No_rec, and Threads (scale 0.0–1.0) for Throughput, Read Latency, Update Latency, and Clean-up Latency.]

Backup Slides / Appendix

PRESEnCE: Testing Campaign

Start time   Disruption after   Benchmark/Service       # Operations   # Records   # Threads
4            79                 ycsb-redis              100            1000        2
13           69                 ycsb-Mongodb            100            500         2
23           27                 ycsb-memcached          200            500         2
29           61                 memtier-redis           200            300         2
35           19                 ycsb-Mongodb            500            1000        2
53           75                 ycsb-redis              500            300         2
62           29                 memtier-Memcached       200            1000        2
65           23                 twitter rpc-redis       100            300         2
68           30                 twitter rpc-memcached   500            500         2

[Figure: Throughput (operations/s) over starting/termination time and latency (ms) for the ycsb-redis, ycsb-memcached, and ycsb-mongodb runs of the testing campaign.]

Backup Slides / Appendix

Generated Workload: dstat CPU

[Figure: CPU usage (%) over time (0–700 s), comparing the generated test load against the normal load (dstat).]

Backup Slides / Appendix

PRESEnCE: The Oracle

[Figure: The Oracle — the benchmarking scenario / testing-campaign trace model and the expected normal-trace model (behaviour of the CSP of SaaS web services) feed the PRESEnCE stealth module's black-box optimization (meta-heuristics); the Oracle asks "is it stealth?": distinguishable (NO) → more optimization; otherwise (YES) → non-distinguishable.]


Backup Slides / Appendix

PRESEnCE: The Oracle

[Figure: Oracle RSS-threshold view, as on slide 40 — emulating the CSP view, compute the RSS distance between the expected normal usage model and the PRESEnCE benchmark traces; RSS < threshold → non-distinguishable, otherwise optimize the distance further.]


Backup Slides / Appendix

Curve-Fitting Problem

Curve-fitting problem

The main objective
→ find and construct a curve or a mathematical model
→ with the best fit to the raw data points, possibly based on some constraints [CUP12]

Optimization problem
→ minimizing the distance between two curves

Fitting a curve can involve:
1. interpolation,
2. smoothing, or
3. regression: Yi = f(Xi, β) + ei

[CUP12] P. George et al. Numerical methods of curve fitting. Cambridge University Press, 2012.
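A small SciPy example of the regression form Yi = f(Xi, β) + ei (the model chosen here is arbitrary, purely for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def f(x, b0, b1, b2):
    # arbitrary illustrative model: quadratic trend
    return b0 + b1 * x + b2 * x ** 2

x = np.linspace(0, 10, 50)
y = f(x, 1.0, 0.5, -0.02) + np.random.normal(0, 0.1, x.size)  # Yi = f(Xi, beta) + ei
beta, _ = curve_fit(f, x, y)  # least-squares estimate of the coefficients
```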


Backup Slides / Appendix

MinMax problem

Minimax

A rule in decision theory for minimizing the maximum loss
→ to define robust solutions [OR09]

Used in decision theory, game theory, statistics and optimization
→ to minimize the possible loss in a worst-case (maximum-loss) scenario
→ to construct solutions having the best possible performance in the worst case

For a solution x ∈ X and a scenario s ∈ S:

$$\min_{x \in X} \; \max_{s \in S} \; F(x, s)$$

[OR09] H. Aissi et al. Min-max and min-max regret versions of combinatorial optimization problems: A survey. Journal of Operational Research, 2009.


Backup Slides / Appendix

Curve Definition

Y: set of normal-trace data, Y = {y1, y2, y3, ..., yn}
Ŷ: set of predicted data, Ŷ = {ŷ1, ŷ2, ŷ3, ..., ŷn}
X: set of variables (PRESEnCE input parameters), X = {x1, x2, x3, ..., xn}
Θ: set of coefficients, Θ = {θ1, θ2, θ3, ..., θm}


Backup Slides / Appendix

Meta-heuristics Algorithm

Genetic Algorithm (GA)

Genetic Algorithms (GAs) are among the most famous and widely used nature-inspired algorithms in black-box optimization. GA operations start from:
→ population initialization,
→ fitness function definition and selection of parents,
→ crossover and mutation, to obtain a new population of solutions.

GA convergence:
how the residual between the two curves is reduced in each generation


Backup Slides / Appendix

Learning-heuristics Algorithm

Regression

Improve the GA performance by:
→ feeding the GA with high-fitness solutions obtained from the regression

RSS: the residual sum of squares — the differences between the actual data and the predicted data

[Figure: Throughput (operations/s) testing trace over time (0–600 s) with the GP regression fit ± STD; RSS = 0.1302359.]
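A sketch of such a Gaussian-process fit with scikit-learn (the kernel choice and the toy trace are assumptions, not the thesis setup):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

t = np.linspace(0, 600, 200).reshape(-1, 1)  # time axis (s)
y = 4500 + 800 * np.sin(t.ravel() / 60) + np.random.normal(0, 50, 200)  # toy trace

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(t, y)
y_pred, y_std = gpr.predict(t, return_std=True)  # fit +/- STD band
rss = float(np.sum((y - y_pred) ** 2))           # residual sum of squares
```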


Backup Slides / Appendix

Distance (RSS)

[Figure: Residual sum of squares (RSS) decreasing from ~1.0 to ~0.2 over 10 GPR iterations.]

Backup Slides / Appendix

Stealth: Methodology

⇒ Proposed approaches for the Stealth module

Instances: Configurations {1, 2, 3, 4, 5, 6, 7, 8, 9}
Single-objective evolutionary algorithms:
→ Meta-heuristics algorithm: Genetic Algorithm (GA) — performance indicator: Convergence
→ Learning-heuristics algorithm: Hybrid Algorithm — performance indicator: Convergence


Backup Slides / Appendix

GA Convergence: Means configs 1:6

[Figure: Mean GA convergence (residual) vs. evaluations for Configurations 1–6.]

Backup Slides / Appendix

GA Convergence: STD configs 1:6

[Figure: STD of GA convergence (residual) vs. evaluations for Configurations 1–6.]

Backup Slides / Appendix

Hybrid Convergence: Means, configurations 1-6

[Figure: six panels, one per configuration, showing the mean Hybrid algorithm convergence (residual) vs. the number of evaluations]

84 / 65

Backup Slides / Appendix

Hybrid Convergence: STD, configurations 1-6

[Figure: six panels, one per configuration, showing the standard deviation of the Hybrid algorithm convergence (residual) vs. the number of evaluations]

85 / 65

Backup Slides / Appendix

Parameters Tuning: Means & STD, configuration 7

[Figure: two panels showing the mean and the standard deviation of the convergence (residual) vs. evaluations (0-100) for configuration 7, one curve per crossover rate CXPB in {0.5, 0.6, 0.7, 0.8}]

86 / 65

Backup Slides / Appendix

Parameters Tuning: Means & STD, configuration 8

[Figure: two panels showing the mean and the standard deviation of the convergence (residual) vs. evaluations (0-200) for configuration 8, one curve per crossover rate CXPB in {0.5, 0.6, 0.7, 0.8}]

87 / 65

Backup Slides / Appendix

Parameters Tuning: Means & STD, configuration 9

[Figure: two panels showing the mean and the standard deviation of the convergence (residual) vs. evaluations (0-500) for configuration 9, one curve per crossover rate CXPB in {0.5, 0.6, 0.7, 0.8}]

88 / 65


Backup Slides / Appendix

Parameters Tuning

                        Configuration 7        Configuration 8        Configuration 9
Expected normal trace   FIFA                   FIFA                   FIFA
Number of generations   10000                  10000                  10000
Population size         20                     50                     100
Number of evaluations   500                    200                    10
Selection process       Bi-Tour                Bi-Tour                Bi-Tour
Crossover operator      2-point                2-point                2-point
Crossover rate          [0.5, 0.6, 0.7, 0.8]   [0.5, 0.6, 0.7, 0.8]   [0.5, 0.6, 0.7, 0.8]
Mutation operator       uniform                uniform                uniform
Mutation rate           0.001                  0.001                  0.001
Number of executions    30                     30                     30

Performance indicator for the PRESEnCE Stealth module ⇒ Convergence
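As one way to realize the table's settings, a sketch using the DEAP library (an assumption: the slides do not name the GA framework); selection, crossover, and mutation follow the table, while the gene bounds and trace length are illustrative:

import random
from deap import base, creator, tools

creator.create("FitnessMin", base.Fitness, weights=(-1.0,))   # lower residual is better
creator.create("Individual", list, fitness=creator.FitnessMin)

TRACE_LEN = 600      # illustrative: one gene per time step of the workload trace

toolbox = base.Toolbox()
toolbox.register("gene", random.randint, 0, 6000)   # ops/s per step (illustrative bounds)
toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.gene, n=TRACE_LEN)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("select", tools.selTournament, tournsize=2)  # Bi-Tour (binary tournament)
toolbox.register("mate", tools.cxTwoPoint)                    # 2-point crossover
toolbox.register("mutate", tools.mutUniformInt, low=0, up=6000, indpb=0.001)  # uniform, rate 0.001
# toolbox.register("evaluate", lambda ind: (convergence_residual(ind, expected_trace),))

# Usage, one run per crossover rate under test:
# from deap import algorithms
# pop = toolbox.population(n=20)   # 20 / 50 / 100 for configurations 7 / 8 / 9
# algorithms.eaSimple(pop, toolbox, cxpb=0.8, mutpb=0.001, ngen=100)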

89 / 65


Backup Slides / Appendix

Tuning: Results

[Figure: box plots of the convergence (residual), 28000-30500, grouped by crossover rate CXPB in {0.8, 0.7, 0.6, 0.5} for configurations 7, 8, and 9; Ex = 30 executions per setting and Ev = 100 / 200 / 500 evaluations respectively; error bars show STD, StdErr, and the 95% confidence interval. Lower convergence is better.]
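The error-bar annotations can be derived from the 30 executions of each setting; a minimal numpy sketch under a normal approximation (illustrative, not the thesis script):

import numpy as np

def summarize(final_residuals):
    # final_residuals: convergence values of the 30 executions of one setting.
    runs = np.asarray(final_residuals, dtype=float)
    mean = runs.mean()
    std = runs.std(ddof=1)               # sample standard deviation (STD)
    stderr = std / np.sqrt(len(runs))    # standard error of the mean (StdErr)
    ci95 = 1.96 * stderr                 # 95% confidence half-width (normal approx.)
    return mean, std, stderr, ci95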

90 / 65



Backup Slides / Appendix

Tuning: Results

Configurations    CXPB = 0.8           CXPB = 0.7           CXPB = 0.6           CXPB = 0.5           p-value
                  Mean      Std        Mean      Std        Mean      Std        Mean      Std
Configuration 7   28675.97  94.29658   29032.91  103.4181   28833.34  94.97001   29193.87  78.38976   2.2e-16
Configuration 8   29044.13  146.8183   29323.41  114.111    29182.42  120.0906   29570.19  110.5912   2.2e-16
Configuration 9   29095.63  314.8841   29771.38  167.7833   30179.23  242.6208   30528.43  214.5603   2.2e-16

[Figure: three box plots, one per configuration 7 / 8 / 9, of the convergence (residual) by crossover rate CXPB in {0.5, 0.6, 0.7, 0.8}; y-ranges roughly 28600-29800, 29000-30500, and 29000-32000. Lower convergence is better.]

Crossover Rate ⇒ 0.8
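The table's p-values do not name the underlying statistical test; one plausible reading (an assumption, not confirmed by the slides) is a Kruskal-Wallis comparison of the four CXPB groups per configuration, sketched here on synthetic stand-in data:

import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(0)
# Synthetic stand-in for the 30 final residuals per crossover rate
# (means loosely echo configuration 7; not the thesis data):
residuals_by_cxpb = [rng.normal(mu, 100.0, size=30)
                     for mu in (28676, 28833, 29033, 29194)]
stat, p_value = kruskal(*residuals_by_cxpb)
print(f"H = {stat:.2f}, p = {p_value:.3g}")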

91 / 65
