
Performance and Security Tradeoff

Katinka Wolter

Bertinoro, June 26, 2010

Table of Contents

Introduction

Performance Cost of Encryption

Performance Evaluation of a Key Distribution Centre

Modelling and Quantifying Intrusion Tolerant Systems

Security of MANETs

Security of the email system

Modelling Performance Security Tradeoff

Conclusions


Katinka Wolter, Performance and Security Tradeoff, SFM’10 2

motivation

- what does the performance security tradeoff mean?
- we need to measure performance
- we need to measure security
- what are the costs of performance?
- what are the costs of security?
- can we trade one against the other?

performance

classical metrics

- throughput
- response time, completion time

evaluation tools

- CTMC
- queueing model
- GSPN, SRN, PEPA

measures

- accumulated reward
- expected reward
- moments of reward
- time to absorption

Performance versus Security

Quantification

- performance can be measured and quantified
- the cost of performance can be quantified
- can we measure security?
- can we determine the cost of security?
- ultimately, cost in terms of performance

Security Cost

It cost British Columbians almost $15 million a day to ensure a peaceful Olympics.

Members of the Vancouver 2010 Olympic Games Integrated Security Unit

Information Week

April 2007

- Forrester Research survey of 28 companies
- security breaches cost $90 to $305 per lost record
- 25% of respondents do not know how to quantify loss

security cost

Google

Gmail now can be set to encrypt communications between a browser and Google’s servers by default, an option that makes the e-mail service harder to snoop on but also potentially slower.

Google mail

Your computer has to do extra work to decrypt all that data, and encrypted data doesn’t travel across the Internet as efficiently as unencrypted data; that’s why we leave the choice up to you.

IBM slogans

IBM Security Solutions

Manage Risk. Reduce Costs. Enable Innovation.

IBM Virtualisation

Virtualisation Security Solutions from IBM Internet Security Systems™

Manage the risks of virtualisation and realise the cost savings.

IBM security

IBM cloud computing security

IBM offers end-to-end solutions that enable you to take a business-driven and holistic approach to securing your cloud computing environment. IBM’s capabilities empower you to dynamically monitor and quantify security risks, enabling you to better:

- understand threats and vulnerabilities in terms of business impact,
- respond to security events with security controls that optimize business results,
- prioritize and balance your security investments.

IBM Security Solutions for Data Centers

Your company can build a secure, dynamic information infrastructure that helps you accelerate innovation while reducing the cost and complexity of security.

energy costs

IT costs

- total energy costs of FUB 10 M Euro
- electricity 50%
- power consumption of FUB’s central IT services
- how much redundancy, security is necessary?

security concerns are not new

Problems

- cost of a security incident unknown
- incidents may not be detected
- information security aims to get close to the theoretical maximum without knowing the cost
- security risks may have very low probability; don’t invest close to the potential damage to prevent, but detect

Source: A Structured Approach to Computer Security, T. Olovsson (1992)

Information Security

CIA Properties

- Confidentiality (information is not passed to unauthorised parties; defense)
- Integrity (information is not modified by unauthorised parties; banking)
- Availability (information is at disposal; telephone)
- (non-repudiation) sender and receiver are authentic

security versus dependability

analogies

- error, fault, failure in dependability
- vulnerability, security fault (Trojan horse), security failure
- failures can be modelled as random processes

differences

- accidental problems in dependability
- intentional problems in security
- attacker accumulates reward
- redundancy is helpful in dependability, detrimental for security

references

- Littlewood, Brocklehurst, Fenton, Mellor, Page, Wright (1993)
- Littlewood, Strigini (2004); Nicol, Sanders, Trivedi (2004)

weak hypothesis

survey of security quantification

- Verendel 2009: survey of 90 papers between 1981 and 2008
- hardly includes model-based analysis
- it is unclear whether the methods applied are appropriate
- quantitative analysis needs large numbers of results
- solid, empirical data is necessary, hence
- Quantified Security is a Weak Hypothesis


security engineering

prevention

protect data and communication to avoid security breaches

diagnosis/detection

identify whether and when a security incident has happened

response

stop attack from causing further damage

recovery

recover from security breach, rekey, use backup data


security metrics

metrics for security in analogy with dependability metrics

[Figure: timeline with incident times t1, t2, detection times td1, td2 and recovery times tr1, tr2]

- TBI: Time Between Incidents
- TTID: Time To Incident Discovery
- TTIR: Time To Incident Recovery
- TBDR: Time Between Detection and Recovery
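These metrics can be estimated directly from an incident log. A minimal sketch in Python, assuming each incident is recorded as a (failure, detection, recovery) timestamp triple; the log values are purely illustrative:

```python
# Incident log: (t_fail, t_detect, t_recover) per incident (illustrative data).
incidents = [(10.0, 14.0, 20.0), (50.0, 51.0, 57.0), (90.0, 96.0, 99.0)]

def mean(xs):
    return sum(xs) / len(xs)

# TTID: time from an incident to its discovery
ttid = mean([td - t for (t, td, tr) in incidents])
# TTIR: time from an incident to its recovery
ttir = mean([tr - t for (t, td, tr) in incidents])
# TBDR: time between detection and recovery
tbdr = mean([tr - td for (t, td, tr) in incidents])
# TBI: time between successive incidents
tbi = mean([incidents[i + 1][0] - incidents[i][0]
            for i in range(len(incidents) - 1)])
print(ttid, ttir, tbdr, tbi)
```

The inverses of these means are exactly the rates used to parameterise the Markovian model on the next slide.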

simple Markovian security model

parameterise using

- inverse of MTBSI as rate of the fail transition
- inverse of MTTID as rate of the detect transition
- inverse of MTBDR as rate of the recover transition

The states relate to prevention, diagnosis, recovery.
Open question: how do we know the rates?
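A sketch of how such a model could be solved, assuming the three states form a simple cycle (good → compromised → recovering → good); in a cyclic CTMC each state's steady-state probability is proportional to its mean sojourn time. The mean times below are illustrative, not measured:

```python
# Steady state of the cyclic three-state CTMC
# good -> compromised -> recovering -> good.
# Rates are inverses of MTBSI, MTTID, MTBDR (illustrative values).
mtbsi, mttid, mtbdr = 1000.0, 5.0, 20.0
fail, detect, recover = 1 / mtbsi, 1 / mttid, 1 / mtbdr

# Each state's probability is proportional to its mean sojourn time 1/rate.
sojourn = [1 / fail, 1 / detect, 1 / recover]
total = sum(sojourn)
pi_good, pi_comp, pi_rec = [s / total for s in sojourn]

availability = pi_good   # the system is secure only in the good state
print(round(availability, 4))
```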

Performance Cost of Encryption

Introduction

Performance Cost of Encryption

Performance Evaluation of a Key Distribution Centre

Modelling and Quantifying Intrusion Tolerant Systems

Security of MANETs

Security of the email system

Modelling Performance Security Tradeoff

Conclusions


performance cost of encryption

experiments

- experimental study, no model
- investigation of different algorithms for symmetric and asymmetric encryption
- investigation of different implementations
- encryption of a 1,137 byte plaintext file
- key length: DES 56 bit, DESede (Triple DES) 112, Skipjack 80, all others 128
- results for symmetric and asymmetric algorithms include key generation, algorithm initialisation and message encryption times

C. Lamprecht, A. van Moorsel, P. Tomlinson, and N. Thomas. Investigating the efficiency of cryptographic algorithms in online transactions. International Journal of Simulation: Systems, Science & Technology, 7(2):63–75, 2006.

performance of Sun JCE implementation

- encryption times range between 85 ms and 180 ms
- triple DES (DESede) hardly slower than DES

performance of Java Cryptix implementation

- encryption times range between 15 ms and 50 ms
- AES (= Rijndael) hardly slower than DES
- triple DES (DESede) slightly slower than DES

conclusions for symmetric encryption

performance versus security

- IDEA and the Cryptix implementation seem to be best
- security measured in key length ⇒ DES and Skipjack less secure
- security and cost do not correlate
- implementation matters

asymmetric encryption

public key cryptography

- encrypt with the destination’s public key
- receiver decrypts with the private key
- avoids the problem of secure key transmission
- security increases with key length
- current security standard RSA-1024
- measurement of key generation and encryption time

speed of public key encryption

- DSA only provides non-repudiation, no data confidentiality
- Diffie-Hellman 1024 is omitted for clarity

message digest

Cost of different algorithms to produce a message digest

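The relative cost of digest algorithms is easy to reproduce. A rough timing sketch using Python's hashlib as a stand-in for the Java implementations measured in the study; input size and repeat count are arbitrary choices, and absolute times are machine-dependent:

```python
import hashlib
import time

data = b"x" * 1_000_000   # 1 MB of input; size is arbitrary

def digest_time(name, repeats=5):
    """Best-of-n wall-clock time to hash `data` with algorithm `name`."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        hashlib.new(name, data).digest()
        best = min(best, time.perf_counter() - start)
    return best

for algo in ["md5", "sha1", "sha256", "sha512"]:
    print(f"{algo:8s} {digest_time(algo) * 1000:.2f} ms")
```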

summary encryption cost

symmetric encryption

IDEA is fastest

asymmetric encryption

best were:
- RSA-1024 for public key encryption
- SHA-256 for hashing (producing a digest)

performance security tradeoff

There is no indication that the recommendations provide a good tradeoff.

Performance Evaluation of a Key Distribution Centre

Introduction

Performance Cost of Encryption

Performance Evaluation of a Key Distribution Centre

Modelling and Quantifying Intrusion Tolerant Systems

Security of MANETs

Security of the email system

Modelling Performance Security Tradeoff

Conclusions


Performance Evaluation of a Key Distribution Centre (Zhao, Thomas)

performance of authentication algorithm

- key distribution for secure access to resources
- key distribution for secure communication
- stochastic process algebra model for the Needham-Schroeder protocol (Kerberos) from [Zhao&Thomas09]

questions

1. how many clients can a given KDC configuration support?
2. how much service capacity must we provide at a KDC to satisfy a given number of clients?
3. how long can a key be used before it is insecure?

Y. Zhao and N. Thomas, Efficient solutions of a PEPA model of a key distribution centre, Performance Evaluation, 67 (2010), pp. 740–756

Performance Evaluation of a Key Distribution Centre (Zhao, Thomas)

1. Alice → KDC : A, B, N1
2. KDC → Alice : {KS, A, B, N1, {KS, IDA}KB}KA
3. Alice → Bob : {KS, IDA}KB
4. Bob → Alice : {N2}KS
5. Alice → Bob : {f(N2)}KS

- N1 and N2 are nonces (random items of data).
- IDA is a unique identifier for Alice.
- f(N) is a predefined function applied to the nonce N.

[Figure: message flow 1–5 between Alice, Bob and the KDC]

- Alice and KDC share a key KA
- Bob and KDC share a key KB
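The five-message exchange can be traced symbolically. A sketch that models encryption as tagged tuples rather than real cryptography, purely to check who can read what; all names are illustrative:

```python
# Symbolic walk-through of the Needham-Schroeder exchange above.
# enc/dec model encryption as a tagged tuple, not real cryptography.
def enc(key, payload):
    return ("enc", key, payload)

def dec(key, box):
    tag, k, payload = box
    assert tag == "enc" and k == key, "wrong key"
    return payload

KA, KB, KS = "K_A", "K_B", "K_S"   # long-term keys and fresh session key
N1, N2 = "N1", "N2"                # nonces
f = lambda n: ("f", n)             # predefined function applied to a nonce

# 1. Alice -> KDC : A, B, N1       (in the clear)
# 2. KDC -> Alice : {KS, A, B, N1, {KS, IDA}KB}KA
msg2 = enc(KA, (KS, "A", "B", N1, enc(KB, (KS, "A"))))
ks, a, b, n1, ticket = dec(KA, msg2)
assert n1 == N1                    # Alice checks her nonce
# 3. Alice -> Bob : {KS, IDA}KB
ks_bob, ida = dec(KB, ticket)      # Bob recovers the session key
# 4. Bob -> Alice : {N2}KS ; 5. Alice -> Bob : {f(N2)}KS
reply = dec(KS, enc(ks_bob, N2))
assert f(reply) == f(N2)           # mutual key confirmation
print("handshake ok, session key:", ks_bob)
```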

scalability

[Figure: N client pairs (Alice1/Bob1, …, AliceN/BobN) all communicating with one KDC]

does it scale?

modelling N pairs of Alice and Bob

PEPA model

For N = 1:

KDC     def= (request, >).(response, rp).KDC
Alice   def= (request, rq).(response, >).Alice′
Alice′  def= (sendBob, rB).(sendAlice, >).(confirm, rc).Alice″
Alice″  def= (usekey, ru).Alice
Bob     def= (sendBob, >).(sendAlice, rA).(confirm, >).Bob′
Bob′    def= (usekey, >).Bob

System  def= KDC ⋈L Alice ⋈K Bob

where L = {request, response}, K = {sendBob, sendAlice, confirm, usekey}.
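For N = 1 the model is a single cycle of exponential activities, so KDC utilisation can be estimated by simulation as the fraction of cycle time spent in the response action. A Monte-Carlo sketch under that reading of the model; the rates are illustrative, chosen near the plotted range:

```python
import random

# Monte-Carlo estimate of KDC utilisation for the N = 1 PEPA model above:
# one client pair cycles through request, response, sendBob, sendAlice,
# confirm, usekey, each duration exponentially distributed.
rq = rB = rA = rc = 1.0
ru = 1.1
rp = 2.0            # KDC service rate (illustrative)

random.seed(42)
busy = total = 0.0
for _ in range(200_000):
    durations = [random.expovariate(r) for r in (rq, rp, rB, rA, rc, ru)]
    busy += durations[1]          # KDC works only during the response action
    total += sum(durations)

print(f"estimated utilisation {busy / total:.3f}")
# Analytic value for comparison: E[response] / E[cycle]
analytic = (1 / rp) / (1 / rq + 1 / rp + 1 / rB + 1 / rA + 1 / rc + 1 / ru)
```

With N > 1 clients competing for the KDC this simple renewal argument no longer holds, which is exactly where the approximations of the paper come in.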

server utilisation of key distribution centre

a number of simplifications and approximations lead to results.

[Figure: average utilisation U versus the number of client pairs N (2 to 22), simulation and approximation for rp = 1, 2, 3, 4; ru = 1.1, rA = rB = rc = rq = 1]

response time of key distribution centre

[Figure: average response time W versus the number of client pairs N (2 to 22), simulation and approximation for rp = 1, 2, 3, 4; ru = 1.1, rA = rB = rc = rq = 1]

1. how many clients can a given KDC configuration support?
2. how much service capacity must we provide at a KDC to satisfy a given number of clients?


server utilisation of key distribution centre

[Figure: average utilisation U varied against the rate of session key use ru (0.01 to 0.05) for rp = 1, …, 5; rq = rA = rB = rc = 1, N = 150]

server response time of key distribution centre

[Figure: average response time W varied against the rate of session key use ru (0.01 to 0.05) for rp = 1, …, 5; rq = rA = rB = rc = 1, N = 150]

3. how long can a key be used before it is insecure?


Performance evaluation of key distribution centre

Summary

- utilisation and response time of the KDC increase with the number of clients
- shorter use of a session key increases security
- shorter use of a session key increases utilisation and response time of the KDC

but

- parameters do not translate to a system
- the tradeoff between performance and security is not formulated

Models for Software-System Security

Introduction

Performance Cost of Encryption

Performance Evaluation of a Key Distribution Centre

Modelling and Quantifying Intrusion Tolerant Systems

Security of MANETs

Security of the email system

Modelling Performance Security Tradeoff

Conclusions


Modelling Intrusion Tolerant Systems

security of intrusion tolerant system

- abstract model for system security
- purpose is to describe and quantify security
- compromise of confidentiality
- compromise of data integrity
- denial of service attacks
- description of security state
- stochastic process with levels of security

B. B. Madan, K. Goseva-Popstojanova, K. Vaidyanathan and K. S. Trivedi. A Method for Modeling and Quantifying the Security Attributes of Intrusion Tolerant Systems, Performance Evaluation (2004), 56, pp. 167–186.

states of the model

good state

preserved through

- authentication, access control, encryption
- firewalls, proxy servers
- strong configuration management, upgrades for known vulnerabilities

vulnerable state

reached through

- penetration
- exploration phases of an attack

active attack state

- potential damage

more states

several degraded states

- masking through redundancy, backups (MC)
- restoration/reconfiguration possible (graceful degradation, GD) to handle DoS
- fail-secure to preserve confidentiality, integrity (FS)

several failed states

- intrusion detection fails (undetected compromised state, UC) (false negative)
- fail with alarm (F) (true positive)

counter measures

design and implementation of intrusion tolerant system

- error detection
- damage assessment
- error recovery, updates (redundancy)
- fault treatment

recovery states

- graceful degradation prevents denial-of-service attack
- stop system to protect confidentiality or data integrity

state-transition model

possible outcome of analysis

where should I invest, depending on attack model?



GSPN model

- unavailable in states FS, F, UC: A = 1 − πFS − πF − πUC
- for DoS: ADoS = 1 − (πF + πUC)
- for MTTSF, states UC, GD, FS and F are absorbing states; compute the time to absorption in a DTMC.
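The mean-time-to-absorption computation reduces to a linear system: with Q_TT the generator restricted to the transient states, the vector of mean times to absorption solves (−Q_TT) t = 1. A self-contained sketch on a hypothetical three-state good/vulnerable/attack chain with one absorbing failed state; the rates are illustrative, not taken from the paper:

```python
# Mean time to absorption for a small CTMC: solve (-Q_TT) t = 1, where
# Q_TT is the generator restricted to the transient states.
# Illustrative chain: G -> V (0.5), V -> G (0.75), V -> A (0.25),
# A -> F (4.0) with F absorbing.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

Q_TT = [
    [-0.5,  0.5,  0.0],   # G: leaves at rate 0.5, all of it to V
    [0.75, -1.0,  0.25],  # V: back to G (0.75) or on to A (0.25)
    [0.0,   0.0, -4.0],   # A: absorbed into F at rate 4.0
]
neg_Q = [[-v for v in row] for row in Q_TT]
t = solve(neg_Q, [1.0, 1.0, 1.0])
print("MTTSF from the good state:", t[0])
```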

measures

considered measures

- availability
- mean time to security failure (MTTSF)

parameters

- mean sojourn times hG = 1/2, hV = 1/3, hA = 1/4, hMC = 1/4, hUC = 1/2, hTR = 1/6
- pa: probability of successful attack from the vulnerable state

results: availability

insights

- a higher probability of successful attack from the vulnerable state pa reduces availability
- a longer mean time in the good state hG increases availability

results: mean time to security failure

insights

- MTTSF increases with a longer mean time in the good state hG
- MTTSF decreases with a higher probability of successful attack from the vulnerable state pa

summary

modelling an intrusion tolerant system

- flexible model that can represent different types of attacks
- quantification of security (considering DoS, confidentiality and integrity attacks)
- inspired by performability analysis
- doubtful parameter choices (planned improvements using SITAR)
- no notion of performance (planned improvements)
- no security cost
- no tradeoff

Security of MANETs

Introduction

Performance Cost of Encryption

Performance Evaluation of a Key Distribution Centre

Modelling and Quantifying Intrusion Tolerant Systems

Security of MANETs

Security of the email system

Modelling Performance Security Tradeoff

Conclusions



security of MANETs

- group communication in mobile ad hoc network using a group key
- intrusion detection system (IDS) checks for compromised nodes
- IDS may not detect (false negative)
- IDS may erroneously detect (false positive)
- IDS may correctly detect and remove
- node is excluded
- new node arrives and is included
- key change is necessary to maintain secure communication

J.-H. Cho, I.-R. Chen, and P.-G. Feng. Performance analysis of dynamic group communication systems with intrusion detection integrated with batch rekeying in mobile ad hoc networks. AINAW ’08: Proceedings of the 22nd International Conference on Advanced Information Networking and Applications – Workshops, pp. 644–649, Washington, DC, USA, 2008.

rekeying in MANETs

intrusion detection

- voting-based intrusion detection
- byzantine failure: more than 1/3 of nodes compromised

rekeying frequency

- rekeying increases security
- rekeying increases load (cost)
- batch rekeying after n membership changes

optimisation problem

how often to change the key for optimal performance and security?


Petri net model

parameters

- k1 rekey limit on (trusted) join and leave requests
- k2 rekey limit on detected and falsely detected compromised nodes

measures

performance measure

average response time R of a transmitted message

security measure

MTTSF (attacker takes over or system becomes unavailable, more than 1/3 compromised nodes)

computation method

- analysis of SPN
- MTTA method (mean time to absorption)

mean time to security failure

parameters

- k1 rekey limit on (trusted) join and leave requests
- k2 rekey limit on detected and falsely detected compromised nodes

response time


insights

vary rekeying thresholds

- a rekeying limit of 4 join/leave requests seems optimal
- for a higher detected/falsely-detected limit, 2 join/leave requests might be better
- either consider fewer join/leave requests, or fewer detected/falsely detected nodes?

intrusion detection interval

rekeying strategies

- individual rekeying (after each join, leave, evict event)
- threshold-based rekeying
  - TAUDT: k1, k2 as above
  - JALDT: k1 = limit on join requests, k2 = limit on leave requests and evicted nodes

parameters

- investigate optimal IDS interval (firing time)
- set TAUDT: (k1, k2) = (4, 1), JALDT: (k1, k2) = (5, 2) (enabling condition)

optimal intrusion detection time

- TIDS = 480 optimises MTTSF for individual rekeying
- TIDS = 600 optimises MTTSF for threshold-based rekeying

optimal intrusion detection interval

- TIDS = 600 optimises response time for all rekeying strategies

conclusions

results

- security and performance of a wireless group communication system
- security is measured in terms of MTTSF
- performance is measured in terms of response time
- the intrusion detection threshold and interval are chosen so as to optimise those measures

Security of the email system

Introduction

Performance Cost of Encryption

Performance Evaluation of a Key Distribution Centre

Modelling and Quantifying Intrusion Tolerant Systems

Security of MANETs

Security of the email system

Modelling Performance Security Tradeoff

Conclusions


Security of the email system

considered system

- email system considered as a queue
- Inbox, filtering mechanisms, user, ...?

attack types

- gather information (malicious access to mailbox, click on a link in a malicious email)
- denial of service (email bombs flood the mail system)

Y. Wang, C. Lin, and Q.-L. Li. Performance Analysis of the Email System under Three Types of Attacks. Performance Evaluation, 67(6), (June 2010)

multiple queues

parameters

each queue is described by an arrival and service time distribution/rate

- emails, M/M/1/N: λ, µ
- cracking password, M/PH/1/1: αc and (γc, Sc)
- malicious email, M/PH/1/1: αm and (γm, Sm)
- email bombs, M/M/1/1: αb, βb
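The email queue's steady state follows from the standard M/M/1/N birth-death solution, π_k ∝ (λ/µ)^k. A sketch with illustrative parameters:

```python
# Steady-state distribution of the M/M/1/N email queue above.
# lam, mu, N are illustrative parameters, not values from the paper.
lam, mu, N = 0.8, 1.0, 10
rho = lam / mu

# pi_k proportional to rho**k for k = 0..N, then normalised
weights = [rho ** k for k in range(N + 1)]
total = sum(weights)
pi = [w / total for w in weights]

mean_queue_length = sum(k * p for k, p in enumerate(pi))
blocking_prob = pi[N]                 # arriving email lost when buffer is full
throughput = lam * (1 - blocking_prob)
print(mean_queue_length, blocking_prob, throughput)
```

The PH-type attack queues of the paper need matrix-analytic methods rather than this geometric form; the sketch covers only the plain email queue.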

Petri net model

performance measures

- queue length
- system availability

security measures

- (availability)
- information leakage probability

performance measures

availability and queue length

- availability versus arrival rate of email bombs for different damage durations
- average queue length versus email arrival rate

security measures

information leakage

- information leakage versus email arrival rate for different arrival rates of cracking attacks
- information leakage probability versus email bomb arrival rate for different probabilities of obtaining information after cracking the password

insights

security of email

- malicious emails are a known security concern
- formalisation as finite queueing models is doubtful
- provided performance as well as security measures
- availability, queue length, information leakage

Modelling Performance Security Tradeoff

Introduction

Performance Cost of Encryption

Performance Evaluation of a Key Distribution Centre

Modelling and Quantifying Intrusion Tolerant Systems

Security of MANETs

Security of the email system

Modelling Performance Security Tradeoff

Conclusions


performance and security model

objective

- separate performance and security models
- combined measures with optima (cf. performability)
- example: encryption of messages (recall Lamprecht et al.)
- assumption: longer keys → more secure, longer encryption time

model specification

- performance model (queue)
- security model (CTMC, ...)

Petri net model

parameters

Parameter   Value/Delay
generate    2.0
send        0.1
N           150
encrypt     0.1, ..., 3.4 by 0.1
TSI         12.5, 25, 50, 100, ..., 15100 by 500
detect      120
recover     360

measures

combine performance and security

- pure performance measure (throughput)
- pure security measure (probability of secure state)
- combined measures involving costs

Throughput(send)   10 · Pr{#processing > 0}
Pr{secure}         E[#secure] = Pr{#secure > 0}
CPSM               Throughput(send) + Pr{secure}
Gain               2 · E[#processing IF #secure = 1]
Loss               −E[#processing IF #insecure = 1]
lowCostRevenue     2 · E[#processing IF #secure = 1] − E[#processing IF #insecure = 1]
highCostRevenue    E[#processing] · (2 · E[#secure] − 5 · E[#insecure])
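The combined measures in the table are plain arithmetic once the steady-state expectations are known. A sketch evaluating them on hypothetical probabilities; all numbers are illustrative, not model output:

```python
# Evaluating the combined measures above from hypothetical steady-state
# expectations of the Petri net (all numbers illustrative).
pr_processing = 0.6          # Pr{#processing > 0}
e_secure = 0.9               # E[#secure] = Pr{#secure > 0}
e_insecure = 1 - e_secure
e_proc_if_secure = 0.55      # E[#processing IF #secure = 1]
e_proc_if_insecure = 0.05    # E[#processing IF #insecure = 1]
e_processing = e_proc_if_secure + e_proc_if_insecure

throughput = 10 * pr_processing
cpsm = throughput + e_secure                 # combined measure
gain = 2 * e_proc_if_secure
loss = -e_proc_if_insecure
low_cost_revenue = gain + loss               # small penalty for insecure work
high_cost_revenue = e_processing * (2 * e_secure - 5 * e_insecure)
print(cpsm, low_cost_revenue, high_cost_revenue)
```

Sweeping the encryption-time parameter and re-evaluating these expressions is what produces the tradeoff curves on the following slides.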

analysis

[Figure: Pr{secure}, throughput and Pr{secure} + throughput versus encryption time (0 to 3.5)]

results

- Pr{secure} and throughput are both higher-is-better metrics (Raj Jain)
- their sum is higher-is-better as well

indirect measures

[Figure: lowCostRevenue and highCostRevenue versus encryption time (0 to 3.5)]

penalties

- a higher penalty ⇒ lower benefit
- the optimal key length is the same

encryption cost

[Figure: lowCostRevenue and gain versus encryption time (0 to 3.5)]

encryption cost

- cost = revenue − gain
- cost negligible for long keys
- cost of security failure

simplified model

separation of performance and security model

- what happens if we keep the submodels completely separate?
- monotonic performance and security measures?

simplified model throughput

[Figure: Pr{secure}, throughput and Pr{secure} + throughput versus encryption time (0 to 3.5)]

combined performance and security measure

- limiting arrival process more pronounced
- throughput unaffected

simplified model revenue

[Figure: lowCostRevenue and highCostRevenue, and lowCostRevenue and gain, versus encryption time (0 to 3.5)]

insights

lessons learnt

- assumptions made: TSI and encryption time are correlated
- processing discontinues/continues in case of recovery; what about the measures?
- do we gain information beyond the assumptions made initially?

parameters

- we find optimal parameter settings!!
- how about realistic parameter values?

numerical issues

[Figure: number of iterations versus encryption time, model with and without inhibitor]

remember performability

- many iterations needed
- poor accuracy

numerical issues

[Figure: probability of secure state versus time between security incidents (security model only; with inhibitor, n = 150; without inhibitor, n = 1, 2, 3, 5, 10, 150) and versus encryption time for maxIter = 1000, 2000, 5000, 1000000, with and without inhibitor]

- solution sensitive to queue length
- solution sensitive to number of iterations

conclusions

quantify security

- model-based analysis of performance and security is a new field, although the issue has been around for long
- we still have no metric for security, but
- frequent change of key or ticket increases security
- longer keys for encryption increase security
- performance can be measured using throughput and response time
- the tradeoff can be formulated

security statement

- cryptographic algorithms are known to be secure
- security problems are dependability problems (overflow, implementation, failures, etc.)

conclusions and outlook

model results

- do we find out something about the system, or about the model?
- setting up a good model is very difficult.

résumé

do we lie with stochastics?
