Quantification of Integrity. Michael Clarkson and Fred B. Schneider, Cornell University. IEEE Computer Security Foundations Symposium, July 17, 2010.


TRANSCRIPT

Page 1:

Quantification of Integrity

Michael Clarkson and Fred B. Schneider, Cornell University

IEEE Computer Security Foundations Symposium, July 17, 2010

Page 2:

Goal

Information-theoretic quantification of programs' impact on integrity of information [Denning 1982]

Page 3:

What is Integrity?

Common Criteria: protection of assets from unauthorized modification

Biba (1977): guarantee that a subsystem will perform as it was intended; isolation necessary for protection from subversion; dual to confidentiality

Databases: constraints that relations must satisfy; provenance of data; utility of anonymized data

…no universal definition

Page 4:

Our Notions of Integrity

Starting point → corruption measure:
Taint analysis → Contamination
Program correctness → Suppression

Corruption: damage to integrity

Page 5:

Our Notions of Integrity

Starting point → corruption measure:
Taint analysis → Contamination
Program correctness → Suppression

Corruption: damage to integrity

Contamination: bad information present in output
Suppression: good information lost from output

…distinct, but interact

Page 6:

Contamination

Goal: model taint analysis

[Diagram: a program with trusted input/output channels (User) and untrusted channels (Attacker)]

Page 7:

Contamination

Goal: model taint analysis

Untrusted input contaminates trusted output

[Diagram: same program model; the untrusted input flows into the trusted output]

Page 8:

Contamination

u contaminates o:

o := (t, u)

Page 9:

Contamination

u contaminates o (Can't u be filtered from o?)

o := (t, u)

Page 10:

Quantification of Contamination

Use information theory: information is surprise

X, Y, Z: distributions
I(X, Y): mutual information between X and Y (in bits)
I(X, Y | Z): conditional mutual information
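These quantities can be computed directly from finite joint distributions. As an illustration (not part of the original slides; function names are my own), a minimal Python sketch of I(X, Y) and I(X, Y | Z):

```python
from collections import defaultdict
from math import log2

def mutual_info(joint):
    """I(X, Y) in bits, from a joint distribution {(x, y): probability}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def cond_mutual_info(joint3):
    """I(X, Y | Z) in bits, from {(x, y, z): probability}."""
    pz, pxz, pyz = defaultdict(float), defaultdict(float), defaultdict(float)
    for (x, y, z), p in joint3.items():
        pz[z] += p
        pxz[(x, z)] += p
        pyz[(y, z)] += p
    return sum(p * log2(pz[z] * p / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), p in joint3.items() if p > 0)

# A fair bit copied exactly carries 1 bit of mutual information.
print(mutual_info({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
```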

Page 11:

Quantification of Contamination

[Diagram: program with trusted channels (User) and untrusted channels (Attacker)]

Page 12:

Quantification of Contamination

[Diagram: trusted input Tin and untrusted input Uin enter the program; trusted output Tout emerges]

Page 13:

Quantification of Contamination

Contamination = I(Uin, Tout | Tin)

[Newsome et al. 2009]
Dual of [Clark et al. 2005, 2007]

Page 14:

Example of Contamination

o := (t, u)

Contamination = I(U, O | T) = k bits, if U is uniform on [0, 2^k − 1]
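This example can be checked numerically. A standalone Python sketch (an illustration, not from the slides) models o := (t, u) for k = 2 and computes the contamination I(U, O | T):

```python
from collections import defaultdict
from math import log2

def cond_mutual_info(joint3):
    """I(X, Y | Z) in bits, from {(x, y, z): probability}."""
    pz, pxz, pyz = defaultdict(float), defaultdict(float), defaultdict(float)
    for (x, y, z), p in joint3.items():
        pz[z] += p
        pxz[(x, z)] += p
        pyz[(y, z)] += p
    return sum(p * log2(pz[z] * p / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), p in joint3.items() if p > 0)

k = 2
# t: uniform bit; u: uniform on [0, 2^k - 1]; the program outputs o := (t, u).
joint = {(u, (t, u), t): 1.0 / (2 * 2 ** k)
         for t in range(2) for u in range(2 ** k)}
print(cond_mutual_info(joint))  # contamination I(U, O | T) = 2.0 bits = k
```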

Page 15:

Our Notions of Integrity

Starting point → corruption measure:
Taint analysis → Contamination
Program correctness → Suppression

Corruption: damage to integrity

Contamination: bad information present in output
Suppression: good information lost from output

Page 16:

Program Suppression

Goal: model program (in)correctness

[Diagram: Specification sends the correct output from Sender to Receiver]

(Specification must be deterministic)

Page 17:

Program Suppression

Goal: model program (in)correctness

[Diagram: Specification yields the correct output (Sender to Receiver); Implementation, with trusted and untrusted channels and an Attacker, yields the real output]

Page 18:

Program Suppression

Goal: model program (in)correctness

Implementation might suppress information about the correct output from the real output

[Diagram: Specification produces the correct output; Implementation produces the real output]

Page 19:

Example of Program Suppression

Spec.:  for (i=0; i<m; i++) { s := s + a[i]; }

a[0..m-1]: trusted

Page 20:

Example of Program Suppression

Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }
Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }

Impl. 1: suppression (a[0] missing); no contamination

a[0..m-1]: trusted

Page 21:

Example of Program Suppression

Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }
Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }
Impl. 2:  for (i=0; i<=m; i++) { s := s + a[i]; }

Impl. 1: suppression (a[0] missing); no contamination
Impl. 2: suppression (a[m] added); contamination

a[0..m-1]: trusted; a[m]: untrusted

Page 22:

Suppression vs. Contamination

output := input

[Diagram: an attacker can contaminate the output or suppress the input]

Page 23:

Quantification of Program Suppression

[Diagram: Specification (Sender to Receiver) and Implementation with trusted and untrusted channels]

Page 24:

Quantification of Program Suppression

[Diagram: as before, with the specification's input labeled In and its output labeled Spec]

Page 25:

Quantification of Program Suppression

[Diagram: the specification maps In to Spec; the implementation maps Tin and Uin to Impl]

Page 26:

Quantification of Program Suppression

Program transmission = I(Spec, Impl)

[Diagram: the specification maps In to Spec; the implementation maps Tin and Uin to Impl]

Page 27:

Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

Page 28:

Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

H(Spec): total info to learn about Spec

Page 29:

Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

H(Spec): total info to learn about Spec
I(Spec, Impl): info actually learned about Spec by observing Impl

Page 30:

Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

H(Spec): total info to learn about Spec
I(Spec, Impl): info actually learned about Spec by observing Impl
H(Spec | Impl): info NOT learned about Spec by observing Impl

Page 31:

Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

Program suppression = H(Spec | Impl)

Page 32:

Example of Program Suppression

Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }
Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }
Impl. 2:  for (i=0; i<=m; i++) { s := s + a[i]; }

Impl. 1: Suppression = H(A)
Impl. 2: Suppression ≤ H(A)

A = distribution of individual array elements
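The claim Suppression = H(A) for Impl. 1 can be checked on a small instance. A Python sketch (illustration only; m = 2 with i.i.d. uniform bit elements is an assumed setup, giving H(A) = 1 bit):

```python
from collections import defaultdict
from itertools import product
from math import log2

def cond_entropy(joint):
    """H(X | Y) in bits, from a joint distribution {(x, y): probability}."""
    py = defaultdict(float)
    for (x, y), p in joint.items():
        py[y] += p
    return sum(p * log2(py[y] / p) for (x, y), p in joint.items() if p > 0)

m = 2  # assumed array length; a[i] are i.i.d. uniform bits, so H(A) = 1 bit
joint = defaultdict(float)  # joint distribution of (Spec output, Impl. 1 output)
for a in product(range(2), repeat=m):
    spec = sum(a)       # Spec: sums a[0..m-1]
    impl1 = sum(a[1:])  # Impl. 1: off-by-one loop drops a[0]
    joint[(spec, impl1)] += 1.0 / 2 ** m

print(cond_entropy(joint))  # suppression H(Spec | Impl) = 1.0 bit = H(A)
```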

Page 33:

Belief-based Metrics

What if the user's/receiver's distribution on unobservable inputs is wrong?

Belief-based information flow [Clarkson et al. 2005]

Belief-based generalizes information-theoretic:
On single executions, the same
In expectation, the same
…if the user's/receiver's distribution is correct

Page 34:

Suppression and Confidentiality

Declassifier: program that reveals (leaks) some information; suppresses the rest

Leakage: [Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]

Thm. Leakage + Suppression is a constant: what isn't leaked is suppressed

Page 35:

Database Privacy

Statistical database anonymizes query results:

…sacrifices utility for privacy's sake

[Diagram: User sends a query to the Database; the response passes through an Anonymizer, which returns an anonymized response to the User]

Page 36:

Database Privacy

Statistical database anonymizes query results:

…sacrifices utility for privacy's sake
…suppresses to avoid leakage

[Diagram: as before; the Anonymizer computes anon. resp. := resp.]

Page 37:

Database Privacy

Statistical database anonymizes query results:

…sacrifices utility for privacy's sake
…suppresses to avoid leakage
…sacrifices integrity for confidentiality's sake

[Diagram: User, Anonymizer, Database, as before]

Page 38:

Database Privacy Security Conditions

k-anonymity [Sweeney 2002]: every individual must be anonymous within a set of size k; every output corresponds to k inputs. …no bound on leakage or suppression

L-diversity [Øhrn and Ohno-Machado 1999, Machanavajjhala et al. 2007]: every individual's sensitive information should appear to have L roughly equally likely values; every output corresponds to L (roughly) equally likely inputs. …implies suppression ≥ log L

Differential privacy [Dwork et al. 2006, Dwork 2006]: no individual loses privacy by including data in the database; the output reveals almost no information about an individual input beyond what the other inputs already reveal. …implies almost all information about the individual is suppressed; quite similar to noninterference

Page 39:

Summary

Measures of information corruption:
Contamination (generalizes taint analysis; dual to leakage)
Suppression (generalizes program correctness; no dual)

Application: database privacy (model anonymizers; relate utility and privacy; security conditions)

Page 40:

More Integrity Measures

Channel suppression: same as the channel model from information theory, but with an attacker

Attacker- and program-controlled suppression

Granularity: average over all executions; single executions; sequences of executions; …interaction of attacker with program

Application: error-correcting codes

Page 41:

Quantification of Integrity

Michael Clarkson and Fred B. Schneider, Cornell University

IEEE Computer Security Foundations Symposium, July 17, 2010

Page 42:

Beyond Contamination and Suppression?

Clark–Wilson integrity policy; relational integrity constraints; software testing metrics; …

Page 43:

Beyond Contamination & Suppression?

[Diagram: an attacker interacting with a program's inputs and outputs]

Page 44:

Confidentiality Dual to Suppression?

output := input

Public inputs lost from public outputs

Classic duality of confidentiality and integrity is incomplete

[Diagram: Sender and Receiver with public and secret channels; Attacker on the secret channels]

Page 45:

Value of Information

What if some bits are worth more than others?

Discrete worth: security levels (top secret, secret, confidential, unclassified)

Continuous worth: ?

Page 46:

Bounded Reasoning

Our agents are logically omniscient.

Bounds? Algorithmic knowledge, computational entropy, …

Page 47:

Information

Information is surprise.

X: random variable on a set of events {e, …}

I(e): self-information conveyed by event e
I(e) = −log2 Pr[X = e] (unit is bits)
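Self-information is a one-line computation; a small Python illustration (not from the slides):

```python
from math import log2

def self_information(prob):
    """I(e) = -log2 Pr[X = e], in bits."""
    return -log2(prob)

print(self_information(0.5))    # a fair coin flip conveys 1.0 bit
print(self_information(0.125))  # a 1-in-8 event conveys 3.0 bits
```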

Page 48:

Suppression vs. Contamination

n := rnd(); o := t xor n   (t suppressed by noise; no contamination)
o := t xor u   (t suppressed by u; o contaminated by u)
o := (t, u)   (o contaminated by u; no suppression)
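The two deterministic programs on this slide can be analyzed exhaustively. The following Python sketch (illustrative; function names are my own) computes channel-style suppression H(T | O) and contamination I(U, O | T) for independent uniform bits t and u:

```python
from collections import defaultdict
from math import log2

def cond_entropy(joint):
    """H(X | Y) in bits, from {(x, y): probability}."""
    py = defaultdict(float)
    for (x, y), p in joint.items():
        py[y] += p
    return sum(p * log2(py[y] / p) for (x, y), p in joint.items() if p > 0)

def cond_mutual_info(joint3):
    """I(X, Y | Z) in bits, from {(x, y, z): probability}."""
    pz, pxz, pyz = defaultdict(float), defaultdict(float), defaultdict(float)
    for (x, y, z), p in joint3.items():
        pz[z] += p
        pxz[(x, z)] += p
        pyz[(y, z)] += p
    return sum(p * log2(pz[z] * p / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), p in joint3.items() if p > 0)

def analyze(program):
    """Suppression H(T|O) and contamination I(U,O|T) for uniform bits t, u."""
    joint_to, joint_uot = defaultdict(float), defaultdict(float)
    for t in range(2):
        for u in range(2):
            o = program(t, u)
            joint_to[(t, o)] += 0.25
            joint_uot[(u, o, t)] += 0.25
    return cond_entropy(joint_to), cond_mutual_info(joint_uot)

print(analyze(lambda t, u: t ^ u))   # o := t xor u: t fully suppressed, o contaminated
print(analyze(lambda t, u: (t, u)))  # o := (t, u): no suppression, o contaminated
```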

Page 49:

Echo Specification

output := input

[Diagram: Sender sends trusted input Tin to Receiver, which receives Tin]

Page 50:

Echo Specification

output := input

[Diagram: the specification echoes Tin; the implementation, with untrusted input Uin and an Attacker, produces Timpl]

Page 51:

Echo Specification

output := input

[Diagram: the specification echoes Tin; the implementation, with untrusted input Uin, produces Tout]

Page 52:

Echo Specification

output := input

Simplifies to the information-theoretic model of channels, with an attacker

[Diagram: the specification echoes Tin; the implementation produces Tout]

Page 53:

Channel Suppression

Channel transmission = I(Tin, Tout)
Channel suppression = H(Tin | Tout)

(Tout depends on Uin)

[Diagram: Sender sends Tin through the Channel; Receiver gets Tout; Attacker supplies Uin]
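For a concrete instance, take a channel in which the attacker flips the trusted bit with some probability. A Python sketch (illustration only; the flip probability 0.25 and the uniform input are assumptions) computes channel transmission and suppression:

```python
from collections import defaultdict
from math import log2

def cond_entropy(joint):
    """H(X | Y) in bits, from {(x, y): probability}."""
    py = defaultdict(float)
    for (x, y), p in joint.items():
        py[y] += p
    return sum(p * log2(py[y] / p) for (x, y), p in joint.items() if p > 0)

flip = 0.25  # assumed probability that the attacker flips the trusted bit
joint = defaultdict(float)  # joint distribution of (Tin, Tout), Tin a uniform bit
for t in range(2):
    joint[(t, t)] += 0.5 * (1 - flip)
    joint[(t, 1 - t)] += 0.5 * flip

suppression = cond_entropy(joint)  # H(Tin | Tout) ≈ 0.81 bits
transmission = 1.0 - suppression   # I(Tin, Tout) = H(Tin) - H(Tin | Tout), H(Tin) = 1
print(transmission, suppression)
```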

Page 54:

Probabilistic Specifications

o := rnd(1)

Correct output: a distribution on outputs
Correct output distribution: a distribution on distributions on outputs

Continuous distributions, differential entropy?

Page 55:

Duality

Page 56:

Contamination vs. Leakage

Contamination = I(Uin, Tout | Tin)
Leakage = I(Sin, Pout | Pin)

[Diagrams: one program with trusted/untrusted channels, one with public/secret channels]

Page 57:

Contamination vs. Leakage

Contamination = I(Uin, Tout | Tin)
Leakage = I(Sin, Pout | Pin) [Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]

Page 58:

Contamination vs. Leakage

Contamination = I(Uin, Tout | Tin)
Leakage = I(Sin, Pout | Pin) [Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]

Contamination is dual to leakage

Page 59:

Confidentiality Dual to Suppression?

No.

Page 60:

L-diversity

Every individual's sensitive information should appear to have L (roughly) equally likely values. [Machanavajjhala et al. 2007]

Entropy L-diversity: H(anon. block) ≥ log L [Øhrn and Ohno-Machado 1999, Machanavajjhala et al. 2007]

H(Tin | tout) ≥ log L (if Tin uniform) …implies suppression ≥ log L

Page 61:

“To Measure is to Know”

When you can measure what you are speaking about…you know something about it; but when you cannot measure it…your knowledge is…meager and unsatisfactory… You have scarcely…advanced to the state of Science.

—Lord Kelvin