
Page 1: WITS08

1

Modeling and Analysis of Anonymous-Communication Systems

Joan Feigenbaum, http://www.cs.yale.edu/homes/jf

WITS’08; Princeton, NJ; June 18, 2008

Acknowledgement: Aaron Johnson

Page 2: WITS08

2

Outline

• Anonymity: What and why

• Examples of anonymity systems

• Theory: Definition and proof

• Practice: Onion Routing

• Theory meets practice

Page 3: WITS08

3

Anonymity: What and Why

• The adversary cannot tell who is communicating with whom. Not the same as confidentiality (and hence not solved by encryption).

• Pro: Facilitates communication by whistleblowers, political dissidents, members of 12-step programs, etc.

• Con: Inhibits accountability

Page 4: WITS08

4

Outline

• Anonymity: What and why

• Examples of anonymity systems

• Theory: Definition and proof

• Practice: Onion Routing

• Theory meets practice

Page 5: WITS08

5

Anonymity Systems

• Remailers / Mix Networks
  – anon.penet.fi
  – MixMaster
  – Mixminion

• Low-latency communication
  – Anonymous proxies, anonymizer.net
  – Freedom
  – Tor
  – JAP

• Data Publishing
  – FreeNet

Page 6: WITS08

6

Mix Networks

• First outlined by Chaum in 1981

• Provide anonymous communication
  – High latency
  – Message-based (“message-oriented”)
  – One-way or two-way

Page 7: WITS08

7

Mix Networks

[Figure: users on the left send messages through a cloud of mixes to destinations on the right.]

Page 8: WITS08

8

Mix Networks

[Figure: the same users–mixes–destinations diagram, now with an adversary observing part of the network.]

Page 9: WITS08

9

Mix Networks

[Figure: users, mixes, and destinations; the protocol runs while the adversary observes part of the network.]

Page 10: WITS08

10

Mix Networks

1. User selects a sequence of mixes and a destination.

[Figure: user u selects the path M1 → M2 → M3 to destination d; the adversary observes part of the network.]

Page 11: WITS08

11

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.


Page 12: WITS08

12

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

Onion Encrypt:
1. Proceed in reverse order of the user’s path.
2. Encrypt (message, next hop) with the public key of the mix.

[Figure: u’s chosen path M1 → M2 → M3 to destination d.]

Page 13: WITS08

13

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

[Figure: u holds the onion {{{m,d}M3, M3}M2, M2}M1, where m is the message and each layer is encrypted with the indicated mix’s public key.]

Page 14: WITS08

14

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

3. Send the message, removing a layer of encryption at each mix.

[Figure: u sends the onion {{{m,d}M3, M3}M2, M2}M1 to the first mix M1.]
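To make steps 1–3 concrete, here is a minimal runnable sketch (not from the original slides); the Mix class and its encrypt/decrypt methods are toy stand-ins for encryption under each mix’s public key.

```python
# A minimal sketch of the onion-encryption step described above.
# Real mix networks use public-key encryption; the toy Mix class below
# stands in for it so the layering/peeling logic is runnable as-is.

class Mix:
    """Toy mix: 'encrypt' wraps a payload so that only this mix can unwrap it."""
    def __init__(self, name):
        self.name = name

    def encrypt(self, payload):          # stand-in for encryption under this mix's public key
        return ("ciphertext-for", self.name, payload)

    def decrypt(self, ciphertext):       # stand-in for decryption with this mix's private key
        tag, name, payload = ciphertext
        assert tag == "ciphertext-for" and name == self.name, "wrong mix"
        return payload

def onion_encrypt(message, destination, path):
    """Proceed in reverse path order, wrapping (payload, next hop) under each mix's key."""
    payload, next_hop = message, destination
    for mix in reversed(path):
        payload, next_hop = mix.encrypt((payload, next_hop)), mix.name
    return payload                       # outermost layer is for the first mix

def route(onion, path):
    """Each mix removes one layer and forwards; the last layer reveals the destination."""
    for mix in path:
        onion, next_hop = mix.decrypt(onion)
    return onion, next_hop               # (message, destination)

M1, M2, M3 = Mix("M1"), Mix("M2"), Mix("M3")
onion = onion_encrypt("m", "d", [M1, M2, M3])   # {{{m,d}M3, M3}M2, M2}M1
print(route(onion, [M1, M2, M3]))               # -> ('m', 'd')
```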

Page 15: WITS08

15

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

3. Send the message, removing a layer of encryption at each mix.

[Figure: M1 has removed its layer and forwards {{m,d}M3, M3}M2 to M2.]

Page 16: WITS08

16

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

3. Send the message, removing a layer of encryption at each mix.

[Figure: M2 has removed its layer and forwards {m,d}M3 to M3.]

Page 17: WITS08

17

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

3. Send the message, removing a layer of encryption at each mix.

[Figure: M3 removes the final layer and delivers m to destination d.]

Page 18: WITS08

18

Mix Networks

Anonymity?

1. No one mix knows both source and destination.

Page 19: WITS08

19

Mix Networks

Anonymity?

1. No one mix knows both source and destination.

2. Adversary cannot follow multiple messages through the same mix.

[Figure: a second user v sends through the mixes to destination f.]

Page 20: WITS08

20

Mix Networks

Anonymity?

1. No one mix knows both source and destination.

2. Adversary cannot follow multiple messages through the same mix.

3. More users provide more anonymity.

[Figure: additional users v and w send through the mixes to destinations e and f.]

Page 21: WITS08

21

Outline

• Anonymity: What and why

• Examples of anonymity systems

• Theory: Definition and proof

• Practice: Onion Routing

• Theory meets practice

Page 22: WITS08

22

Provable Anonymity in Mix Networks

Setting

• N users

• Passive, local adversary

  – Adversary observes some of the mixes and the links.

  – A fraction f of the links is not observed by the adversary.

• Users and mixes are roughly synchronized.

• Users choose mixes uniformly at random.

Page 23: WITS08

23

Provable Anonymity in Mix Networks

Definition

• Users should be unlinkable to their destinations.

• Let Π be a random permutation that maps users to destinations.

• Let C be the traffic matrix observed by the adversary during the protocol: C_{e,i} = number of messages on link e in round i.

[Example: a small traffic matrix with rows for links e1 and e2, columns for rounds 1–5, and entries counting the messages on each link in each round.]
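As an illustration of this definition (not from the original slides), the sketch below builds a toy traffic matrix for one batch of messages. N, T, f, and the uniformly random mix choices follow the setting above; the routing details and the per-link observation model are simplifying assumptions of the sketch.

```python
# Build a toy traffic matrix C: C[(link, round)] = number of messages the
# adversary sees on that link in that round. A fraction f of links is hidden.
import random
from collections import defaultdict

def simulate_traffic_matrix(N=8, n_mixes=4, T=3, f=0.3, seed=1):
    rng = random.Random(seed)
    users = [f"u{j}" for j in range(N)]
    mixes = [f"M{j}" for j in range(n_mixes)]
    dests = [f"d{j}" for j in range(N)]
    perm = rng.sample(dests, N)        # the hidden user-to-destination permutation Pi

    # Each message takes T uniformly random mix hops, then exits to its destination.
    paths = {u: [rng.choice(mixes) for _ in range(T)] + [perm[j]]
             for j, u in enumerate(users)}

    C = defaultdict(int)
    observed = {}                      # link -> does the adversary see it?
    for u in users:
        hops = [u] + paths[u]
        for i, link in enumerate(zip(hops, hops[1:])):
            if link not in observed:
                observed[link] = rng.random() >= f   # a fraction f of links is unobserved
            if observed[link]:
                C[(link, i)] += 1                    # C_{e,i}
    return C, perm

C, perm = simulate_traffic_matrix()
for (link, rnd), count in sorted(C.items(), key=lambda kv: kv[0][1]):
    print(f"round {rnd}: {count} message(s) on {link[0]} -> {link[1]}")
```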

Page 24: WITS08

24

Provable Anonymity in Mix Networks

Information-theory background

• Use information theory to quantify the information gained from observing C.

• H(X) = Σ_x −Pr[X=x] log Pr[X=x] is the entropy of the random variable X.

• I(X : Y) is the mutual information between X and Y.

• I(X : Y) = H(X) − H(X | Y) = Σ_{x,y} Pr[X=x ∧ Y=y] log( Pr[X=x ∧ Y=y] / (Pr[X=x] Pr[Y=y]) )
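These quantities are easy to compute for small distributions; a brief illustrative sketch (not from the slides), using log base 2 and a joint distribution given as a dict {(x, y): probability}:

```python
# Entropy and mutual information as defined above.
import math

def entropy(dist):
    """H(X) = sum_x -Pr[X=x] log Pr[X=x]."""
    return sum(-p * math.log2(p) for p in dist.values() if p > 0)

def mutual_information(joint):
    """I(X:Y) = sum_{x,y} Pr[x,y] log( Pr[x,y] / (Pr[x] Pr[y]) )."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Example: X and Y perfectly correlated -> I(X:Y) = H(X) = 1 bit.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(entropy({0: 0.5, 1: 0.5}), mutual_information(joint))
```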

Page 25: WITS08

25

Provable Anonymity in Synchronous Protocols

Definition: The protocol is ε(N)-unlinkable if I(C : Π) ≤ ε(N).

Definition: An ε(N)-unlinkable protocol is efficient if:

1. It takes T(N) = O(polylog(N/ε(N))) rounds.

2. It uses O(N·T(N)) messages.

Theorem (Berman, Fiat, and Ta-Shma, 2004): The basic mixnet protocol is ε(N)-unlinkable and efficient when T(N) = Θ(log(N) log²(N/ε(N))).
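Written out, with Π the random user-to-destination permutation and C the observed traffic matrix, the definitions and theorem above read:

```latex
\textbf{Definition.} The protocol is $\varepsilon(N)$-unlinkable if $I(C : \Pi) \le \varepsilon(N)$.

\textbf{Definition.} An $\varepsilon(N)$-unlinkable protocol is \emph{efficient} if it takes
$T(N) = O(\mathrm{polylog}(N/\varepsilon(N)))$ rounds and uses $O(N \cdot T(N))$ messages.

\textbf{Theorem} (Berman, Fiat, and Ta-Shma, 2004)\textbf{.} The basic mixnet protocol is
$\varepsilon(N)$-unlinkable and efficient when
$T(N) = \Theta\bigl(\log N \cdot \log^2(N/\varepsilon(N))\bigr)$.
```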

Page 26: WITS08

26

Outline

• Anonymity: What and why

• Examples of anonymity systems

• Theory: Definition and proof

• Practice: Onion Routing

• Theory meets practice

Page 27: WITS08

27

Onion Routing [GRS’96]

• Practical design with low latency and overhead

• Connection-oriented, two-way communication

• Open source implementation (http://tor.eff.org)

• Over 1000 volunteer routers

• Estimated 200,000 users

Page 28: WITS08

28

How Onion Routing Works

[Figure: user u runs the onion-routing client; routers 1–5 run onion-routing servers; d is an Internet destination.]

Page 29: WITS08

29

How Onion Routing Works

[Figure: u begins building a circuit through the onion routers.]

1. u creates 3-hop circuit through routers (u.a.r.).

Page 30: WITS08

30

How Onion Routing Works


1. u creates 3-hop circuit through routers (u.a.r.).

Page 31: WITS08

31

How Onion Routing Works


1. u creates 3-hop circuit through routers (u.a.r.).

Page 32: WITS08

32

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.


Page 33: WITS08

33

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Figure: u sends the data cell {{{m}3}4}1 toward router 1; m is the data, with one encryption layer per circuit hop (routers 1, 4, 3).]

Page 34: WITS08

34

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Figure: router 1 removes its layer and forwards {{m}3}4 to router 4.]

Page 35: WITS08

35

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Figure: router 4 removes its layer and forwards {m}3 to router 3.]

Page 36: WITS08

36

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Figure: router 3 removes the last layer and sends the data m to destination d.]

Page 37: WITS08

37

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Figure: d sends reply data m′ back to router 3.]

Page 38: WITS08

38

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Figure: router 3 adds its layer and sends {m′}3 to router 4.]

Page 39: WITS08

39

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Figure: router 4 adds its layer and sends {{m′}3}4 to router 1.]

Page 40: WITS08

40

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Figure: router 1 adds its layer and sends {{{m′}3}4}1 to u, who removes all three layers to recover m′.]

Page 41: WITS08

41

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

4. Stream is closed.


Page 42: WITS08

42

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

4. Stream is closed.

5. Circuit is changed every few minutes.

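A minimal runnable sketch of the forward and return flows in steps 2–4 (an illustrative model, not Tor’s actual protocol; the Router class, the circuit layout, and the use of per-hop Fernet keys shared at circuit setup are assumptions of the sketch):

```python
# Sketch of data cells traversing a circuit: the client holds one symmetric key
# per hop, wraps outgoing data in one layer per hop, and unwraps the reply.
# Fernet (from the 'cryptography' package) stands in for the per-hop cipher.
from cryptography.fernet import Fernet

class Router:
    def __init__(self, name):
        self.name = name
        self.key = Fernet(Fernet.generate_key())   # shared with the client at circuit setup

routers = {n: Router(n) for n in ("R1", "R4", "R3")}
circuit = ["R1", "R4", "R3"]                        # u -> R1 -> R4 -> R3 -> d

def client_wrap(data: bytes) -> bytes:
    """u adds one layer per hop, innermost for the exit: {{{m}R3}R4}R1."""
    for name in reversed(circuit):
        data = routers[name].key.encrypt(data)
    return data

def forward(cell: bytes) -> bytes:
    """Each router removes its own layer; the exit router sees the plaintext for d."""
    for name in circuit:
        cell = routers[name].key.decrypt(cell)
    return cell

def reply(data: bytes) -> bytes:
    """On the way back, each router adds its layer; u removes them all."""
    for name in reversed(circuit):                  # R3, then R4, then R1 add layers
        data = routers[name].key.encrypt(data)
    for name in circuit:                            # u peels R1, R4, R3 in order
        data = routers[name].key.decrypt(data)
    return data

print(forward(client_wrap(b"GET / HTTP/1.0")))      # what the exit hands to d
print(reply(b"HTTP/1.0 200 OK"))                    # what u recovers from d's answer
```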

Page 43: WITS08

43

Adversary

Active & local: the adversary controls some of the onion routers.

[Figure: u’s circuit through the routers to d.]

Page 44: WITS08

44

Outline

• Anonymity: What and why

• Examples of anonymity systems

• Theory: Definition and proof

• Practice: Onion Routing

• Theory meets practice

Page 45: WITS08

45

Formal Analysis (F., Johnson, and Syverson, 2007)

Timing attacks result in four cases:

[Figure: users u, v, w build circuits through routers 1–5 to destinations d, e, f.]

Page 46: WITS08

46

Formal Analysis (F., Johnson, and Syverson, 2007)

Timing attacks result in four cases:

1. First router compromised

Page 47: WITS08

47

Formal Analysis (F., Johnson, and Syverson, 2007)

Timing attacks result in four cases:

1. First router compromised

2. Last router compromised

Page 48: WITS08

48

Formal Analysis (F., Johnson, and Syverson, 2007)

Timing attacks result in four cases:

1. First router compromised

2. Last router compromised

3. First and last compromised

Page 49: WITS08

49

Formal Analysis (F., Johnson, and Syverson, 2007)

Timing attacks result in four cases:

1. First router compromised

2. Last router compromised

3. First and last compromised

4. Neither first nor last compromised

[Figure: users u, v, w with circuits through routers 1–5 to destinations d, e, f.]
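In the black-box model introduced on the next slide, the first and last routers of u’s circuit are each compromised independently with probability b, so the four cases occur with easily tabulated probabilities. A small illustrative sketch (not from the talk):

```python
# Probabilities of the four timing-attack cases, assuming the first and last
# routers are compromised independently with probability b (adversary fraction b).
def case_probabilities(b: float):
    return {
        "first and last compromised (u linked to d)": b * b,
        "first compromised only":                     b * (1 - b),
        "last compromised only":                      (1 - b) * b,
        "neither compromised":                        (1 - b) * (1 - b),
    }

for case, p in case_probabilities(b=0.1).items():
    print(f"{case}: {p:.3f}")
```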

Page 50: WITS08

50

Black-Box, Onion-Routing Model

Let U be the set of users.

Let Δ be the set of destinations.

Let the adversary control a fraction b of the routers.

Configuration C:

• User destinations C_D : U → Δ

• Observed inputs C_I : U → {0,1}

• Observed outputs C_O : U → {0,1}

Let X be a random configuration such that:

Pr[X=C] = Π_u [p_u^{C_D(u)}] [b^{C_I(u)} (1−b)^{1−C_I(u)}] [b^{C_O(u)} (1−b)^{1−C_O(u)}]
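The product form of Pr[X=C] says a random configuration can be sampled coordinate by coordinate: each user’s destination from p_u, and the two observation bits as independent Bernoulli(b) draws. A minimal sampling sketch (illustrative names, not from the talk):

```python
# Sample a random configuration X in the black-box model:
# destination ~ p_u, observed-input and observed-output bits ~ Bernoulli(b).
import random

def sample_configuration(p, b, rng=random):
    """p maps each user to its destination distribution {destination: probability}."""
    config = {}
    for u, dist in p.items():
        dests, weights = zip(*dist.items())
        config[u] = {
            "destination": rng.choices(dests, weights=weights)[0],  # C_D(u)
            "input_observed": rng.random() < b,                     # C_I(u)
            "output_observed": rng.random() < b,                    # C_O(u)
        }
    return config

p = {"u": {"d": 0.6, "e": 0.4}, "v": {"d": 0.2, "e": 0.8}}
print(sample_configuration(p, b=0.1))
```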

Page 51: WITS08

51

Indistinguishability

[Figure: four indistinguishable configurations over users u, v, w and destinations d, e, f.]

Page 52: WITS08

52

Indistinguishability

[Figure: the same four indistinguishable configurations.]

Note: Indistinguishable configurations form an equivalence relation.

Page 53: WITS08

53

Probabilistic Anonymity

The metric Y for the linkability of u and d in C is:

Y(C) = Pr[X_D(u)=d | X ∈ [C]], where [C] is the set of configurations indistinguishable from C.

Page 54: WITS08

54

Probabilistic Anonymity

The metric Y for the linkability of u and d in C is:

Y(C) = Pr[X_D(u)=d | X ∈ [C]]

Note: This is different from the metric of mutual information used to analyze mix nets.

Page 55: WITS08

55

Probabilistic Anonymity

The metric Y for the linkability of u and d in C is:

Y(C) = Pr[X_D(u)=d | X ∈ [C]]

Exact Bayesian inference

• Adversary after long-term intersection attack

• Worst-case adversary

Page 56: WITS08

56

Probabilistic Anonymity

The metric Y for the linkability of u and d in C is:

Y(C) = Pr[X_D(u)=d | X ∈ [C]]

Exact Bayesian inference

• Adversary after long-term intersection attack

• Worst-case adversary

Linkability given that u visits d:

E[Y | X_D(u)=d]

Page 57: WITS08

57

Anonymity Bounds

1. Lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Page 58: WITS08

58

Anonymity Bounds

1. Lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

2. Upper bounds:

   a. p_v^1 = 1 for all v ≠ u, where destination 1 satisfies p_u^1 ≥ p_u^e for e ≠ d

   b. p_v^d = 1 for all v ≠ u

Page 59: WITS08

59

Anonymity Bounds

1. Lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

2. Upper bounds:

   a. p_v^1 = 1 for all v ≠ u, where destination 1 satisfies p_u^1 ≥ p_u^e for e ≠ d

      E[Y | X_D(u)=d] ≤ b + (1−b) p_u^d + O(log n / n)

   b. p_v^d = 1 for all v ≠ u

      E[Y | X_D(u)=d] ≤ b² + (1−b²) p_u^d + O(log n / n)
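Ignoring the O(log n / n) terms, the bounds can be compared numerically; the sketch below (illustrative, not from the talk) evaluates the lower bound and the two upper-bound estimates for a few values of b:

```python
# Lower bound b^2 + (1-b^2)p and the two upper-bound estimates from above,
# ignoring the O(log n / n) terms.
def lower_bound(b, p):           # holds in all cases
    return b**2 + (1 - b**2) * p

def upper_estimate_a(b, p):      # other users always visit u's most likely non-d destination
    return b + (1 - b) * p

def upper_estimate_b(b, p):      # other users always visit d
    return b**2 + (1 - b**2) * p

for b in (0.05, 0.1, 0.2):
    p = 0.01                     # u visits d rarely
    print(f"b={b}: lower {lower_bound(b, p):.4f}, "
          f"worst case (a) {upper_estimate_a(b, p):.4f}, "
          f"case (b) {upper_estimate_b(b, p):.4f}")
```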

Page 60: WITS08

60

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Page 61: WITS08

61

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof:

Page 62: WITS08

62

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof: E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

Page 63: WITS08

63

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof: E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

Page 64: WITS08

64

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Let {C_i} be the configuration equivalence classes. Let D_i be the event X ∈ C_i ∧ X_D(u)=d.

Page 65: WITS08

65

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Let {C_i} be the configuration equivalence classes. Let D_i be the event X ∈ C_i ∧ X_D(u)=d.

E[Y | X_D(u)=d ∧ X_I(u)=0] = Σ_i (Pr[D_i])² / (Pr[C_i] Pr[X_D(u)=d])

Page 66: WITS08

66

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Let {C_i} be the configuration equivalence classes. Let D_i be the event X ∈ C_i ∧ X_D(u)=d.

E[Y | X_D(u)=d ∧ X_I(u)=0] = Σ_i (Pr[D_i])² / (Pr[C_i] Pr[X_D(u)=d])

  ≥ (Σ_i Pr[D_i])² / ((Σ_i Pr[C_i]) Pr[X_D(u)=d])   (by Cauchy-Schwarz)

Page 67: WITS08

67

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Let {C_i} be the configuration equivalence classes. Let D_i be the event X ∈ C_i ∧ X_D(u)=d.

E[Y | X_D(u)=d ∧ X_I(u)=0] = Σ_i (Pr[D_i])² / (Pr[C_i] Pr[X_D(u)=d])

  ≥ (Σ_i Pr[D_i])² / ((Σ_i Pr[C_i]) Pr[X_D(u)=d])   (by Cauchy-Schwarz)

  = p_u^d   (since Σ_i Pr[C_i] = 1 and Σ_i Pr[D_i] = Pr[X_D(u)=d] = p_u^d)
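The full chain of the argument, written in LaTeX (using Σ_i Pr[C_i] = 1 and Σ_i Pr[D_i] = Pr[X_D(u)=d]):

```latex
\begin{aligned}
E[\,Y \mid X_D(u)=d \wedge X_I(u)=0\,]
  &= \sum_i \frac{\Pr[D_i]^2}{\Pr[C_i]\,\Pr[X_D(u)=d]} \\
  &\ge \frac{\bigl(\sum_i \Pr[D_i]\bigr)^2}{\bigl(\sum_i \Pr[C_i]\bigr)\,\Pr[X_D(u)=d]}
      \quad\text{(Cauchy-Schwarz)} \\
  &= \frac{\Pr[X_D(u)=d]^2}{\Pr[X_D(u)=d]} \;=\; p_u^d .
\end{aligned}
```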

Page 68: WITS08

68

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof: E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

Page 69: WITS08

69

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof: E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

                       ≥ b² + b(1−b) p_u^d + (1−b) p_u^d

Page 70: WITS08

70

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof: E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

                       ≥ b² + b(1−b) p_u^d + (1−b) p_u^d

                       = b² + (1−b²) p_u^d

Page 71: WITS08

71

Upper Bound

Page 72: WITS08

72

Upper Bound

Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when

1. p_v^1 = 1 for all v ≠ u, OR

2. p_v^d = 1 for all v ≠ u.

Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … ≥ p_u^{|Δ|} (so destination 1 is u’s most likely destination other than d).

Page 73: WITS08

73

Upper Bound

Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when

1. p_v^1 = 1 for all v ≠ u, OR

2. p_v^d = 1 for all v ≠ u.

Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … ≥ p_u^{|Δ|}.

Show that the maximum occurs when, for all v ≠ u, p_v^{e_v} = 1 for some destination e_v.

Page 74: WITS08

74

Upper Bound

Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when

1. p_v^1 = 1 for all v ≠ u, OR

2. p_v^d = 1 for all v ≠ u.

Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … ≥ p_u^{|Δ|}.

Show that the maximum occurs when, for all v ≠ u, p_v^{e_v} = 1 for some destination e_v.

Show that the maximum occurs when, for all v ≠ u, e_v = d or e_v = 1.

Page 75: WITS08

75

Upper Bound

Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when

1. p_v^1 = 1 for all v ≠ u, OR

2. p_v^d = 1 for all v ≠ u.

Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … ≥ p_u^{|Δ|}.

Show that the maximum occurs when, for all v ≠ u, p_v^{e_v} = 1 for some destination e_v.

Show that the maximum occurs when, for all v ≠ u, e_v = d or e_v = 1.

Show that the maximum occurs when e_v = d for all v ≠ u, or when e_v = 1 for all v ≠ u.

Page 76: WITS08

76

Upper-bound Estimates

Let n be the number of users.

Page 77: WITS08

77

Upper-bound Estimates

Let n be the number of users.

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1−(1−p_u^1)b) + O(log n / n)

Page 78: WITS08

78

Upper-bound Estimates

Let n be the number of users.

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1−(1−p_u^1)b) + O(log n / n)

Theorem 5: When p_v^d = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b)² p_u^d/(1−(1−p_u^d)b) + O(log n / n)

Page 79: WITS08

79

Upper-bound Estimates

Let n be the number of users.

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1−(1−p_u^1)b) + O(log n / n)

Page 80: WITS08

80

Upper-bound Estimates

Let n be the number of users.

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1−(1−p_u^1)b) + O(log n / n)

              ≈ b + (1−b) p_u^d   (for p_u^1 small)

Page 81: WITS08

81

Upper-bound Estimates

Let n be the number of users.

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1−(1−p_u^1)b) + O(log n / n)

              ≈ b + (1−b) p_u^d   (for p_u^1 small)

Recall the lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Page 82: WITS08

82

Upper-bound Estimates

Let n be the number of users.

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1−(1−p_u^1)b) + O(log n / n)

              ≈ b + (1−b) p_u^d   (for p_u^1 small)

Recall the lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Increased chance of total compromise: from b² to b.

Page 83: WITS08

83

Conclusions

• Many challenges remain in the design, implementation, and analysis of anonymous-communication systems.

• It is hard to prove theorems about real systems – or even to figure out what to prove.

• “Nothing is more practical than a good theory!” (Tanya Berger-Wolfe, UIC)