
Page 1

©2005, Karl Aberer, School of Computer and Communication Sciences. ICPS, Santorini, July 12, 2005

Managing Trust in Distributed Environments

Karl Aberer, EPFL, Lausanne, Switzerland
lsirwww.epfl.ch

[email protected]

Joint work with Zoran Despotovic, EPFL

(now at Euro-Docomo Labs, Munich)

Page 2

EPFL

Page 3

Sponsors

• Swiss National Centre of Competence in Research on Mobile Information and Communication Systems (NCCR-MICS) (www.mics.ch)
  – investigates self-organization at all system layers
  – mobile ad-hoc, sensor and peer-to-peer networks

• Evergrow - Ever-growing global scale-free networks, their provisioning, repair and unique functions (EU, www.evergrow.org)
  – next-generation Internet

Page 4

Overview

1. Trust in P2P Systems
2. A Complaint-based Reputation Mechanism
3. Trust and Trust Management
4. Trust Management Mechanisms
5. Applications
6. Conclusions

Page 5

1. Peer-to-Peer Systems

• Resource Sharing (e.g. media content, computational resources)
  – no centralized infrastructure
  – global-scale information systems

Page 6

Resource Sharing

• What is shared?

– content, e.g. described by metadata:

    <rdf:Description about='' xmlns:xap='http://ns.adobe.com/xap/1.0/'>
      <xap:CreateDate>2001-12-19T18:49:03Z</xap:CreateDate>
      <xap:ModifyDate>2001-12-19T20:09:28Z</xap:ModifyDate>
      <xap:Creator>John Doe</xap:Creator>
    </rdf:Description>
    …

– knowledge
– bandwidth
– storage
– processing

Page 7

Whom to Trust?

• P2P systems
  – open communities – everyone can join and leave freely
  – strong feeling of autonomy
  – no central authority available
  – contractual agreements and litigation highly inefficient

• Example: Freeriding [Adar, Huberman 2000]
  – 66% of the peers share no files
  – the top 1 percent of peers share 37% of the total files shared

Page 8

Using Reputation

• Reputation systems as a solution to boost trust
  – Repeated interactions
  – Past doings predict future performance
  – Collecting, processing and disseminating feedback

[Figure: interactions → feedback (☺) → trust → predicted behavior]

Page 9

Example eBay

• Does it really matter?
  – eBay reputation profiles predictive of future performance [Resnick et al., 2002]
  – Prices positively correlated with the feedback [Melnik and Alm, 2002]

• What do we miss?
  – eBay sales figure: $34 billion gross merchandise volume for 2004

Page 10

Comparing eBay and P2P

• Commonalities
  – open community
  – repeated interactions
  – possibility to misbehave and misreport

• Differences
  – no centralized authority to manage reputation information
  – possibility to manipulate reports of others
  – reputation data interpreted by humans

• Question: which methods do exist for efficiently managing trust in the absence of any centralized infrastructure?

Page 11

2. A Complaint-based Reputation Mechanism

• A heuristic method to manage trust in P2P [Aberer, Despotovic 2001]

• Binary model: two outcomes of interactions
  – Peers file complaints on their partners' misbehavior
  – Cheaters also file complaints to hide their own misbehavior
  – Model used in some gaming servers

• Decision whether a peer is honest or cheating reached by analyzing complaints

• Complaints are stored in a peer-to-peer overlay network

• Dishonest peers may manipulate stored complaints

Page 12

Storage of Complaints

• Peer-to-peer overlay networks
  – store and retrieve resources (complaints on a peer) identified by application-specific keys (peer identifiers)
  – overlay network maintained on top of a physical network

• Two main approaches
  – unstructured: e.g. Gnutella
  – structured: e.g. Chord, P-Grid, …

Page 13

Unstructured Overlay Networks

• Search requests are broadcast (gossiped) to all participants in the network
  – robust, but inefficient in terms of message traffic

[Figure: a peer floods the query "Find complaints on peer i"; a peer holding complaints on peer i answers]
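To make the broadcast search concrete, here is a minimal sketch of TTL-limited flooding for complaints in an unstructured overlay; the Peer class, the neighbor lists and the find_complaints name are illustrative assumptions, not the Gnutella protocol.

    # Minimal sketch of flooding search in an unstructured overlay (illustrative only).
    class Peer:
        def __init__(self, peer_id):
            self.peer_id = peer_id
            self.neighbors = []          # overlay links to other Peer objects
            self.local_complaints = {}   # target_id -> complaints this peer has filed

        def find_complaints(self, target_id, ttl=3, seen=None):
            """Flood the query to all neighbors; robust, but message traffic grows quickly."""
            seen = set() if seen is None else seen
            if self.peer_id in seen or ttl < 0:
                return []
            seen.add(self.peer_id)
            results = list(self.local_complaints.get(target_id, []))
            for neighbor in self.neighbors:
                results.extend(neighbor.find_complaints(target_id, ttl - 1, seen))
            return results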

Page 14

Structured Overlay Networks

• Peers are embedded into the identifier space
  – Each peer stores complaints on some subset of peers
  – greedy routing, few messages (logarithmic)
  – cost of maintaining the correct links (routing tables)

[Figure: peer j stores complaints about peer i; peer k asks "complaints about peer i?" and the query is routed to peer j]

Page 15

Example: P-Grid

• Distributed binary search tree

[Figure: P-Grid distributed binary search tree over keys 000–111, with prefixes 0??/1?? and 00?/01?/10?/11? assigned to peers 1–4; query(peer 1, 101) is forwarded as query(peer 4, 101), then query(peer 3, 101): found!]
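As a rough illustration of the greedy prefix routing shown in the figure, the following toy code routes a query through a P-Grid-like binary key space; the routing-table layout (refs), the store dictionary and the class name are assumptions, not the actual P-Grid implementation.

    # Toy sketch of prefix routing over a binary key space, in the spirit of P-Grid.
    class PGridPeer:
        def __init__(self, path, store):
            self.path = path      # binary prefix this peer is responsible for, e.g. "10"
            self.refs = {}        # level -> a peer responsible for the complementary bit
            self.store = store    # data (e.g. complaints) for keys under self.path

        def query(self, key):
            # Greedy routing: forward at the first bit where the key leaves our prefix.
            for level, bit in enumerate(self.path):
                if key[level] != bit:
                    return self.refs[level].query(key)   # hand over to the other subtree
            return self.store.get(key)                   # key lies in our subtree

    # Starting at the peer for prefix "00", query("101") takes at most one hop per bit,
    # i.e. a logarithmic number of messages.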

Page 16

Reputation Aggregation

• Idea: if a peer receives substantially above-average complaints, it is likely to be a cheater
  – Measure the fraction of cheating interactions fcheated
  – Retrieve complaints about peer p from all peers storing complaints about p (replicas)
  – For each replica evaluate a probabilistically determined criterion (Chernoff bounds)

• Two algorithms:
  – Simple – take the simple majority decision (see the sketch below)
  – Complex – check the trustworthiness of replicas in case of a tie

Per-replica decision criterion: decide(p) = 1 (honest) if the normalized complaint count c_i^norm(p) stays below a Chernoff-bound threshold expressed in terms of the average normalized complaints and fcheated (constants 4.32 and 0.84 on the slide); otherwise decide(p) = -1 (cheater).
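A minimal sketch of the "Simple" aggregation variant, assuming each replica reports a complaint count for p and replacing the Chernoff-bound criterion with a plain above-average threshold (the slack factor is an arbitrary placeholder):

    # Sketch of the "Simple" majority decision over the replicas storing complaints about p.
    def replica_vote(complaints_about_p, avg_complaints, slack=2.0):
        """One replica's verdict: -1 (cheater) if p has far more complaints than average."""
        return -1 if complaints_about_p > slack * avg_complaints else 1

    def decide(replica_reports, avg_complaints):
        """replica_reports: complaint counts about p as seen by the different replicas."""
        votes = [replica_vote(c, avg_complaints) for c in replica_reports]
        total = sum(votes)
        if total == 0:
            return 0     # tie: the "Complex" variant would now check replica trustworthiness
        return 1 if total > 0 else -1   # 1 = honest, -1 = cheater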

Page 17

Experimental Evaluation

• Independent cheating (misclassification rate)

Page 18

Results

• Several positive aspects ☺
  – Method addresses misbehaviors (complaints)
  – Method addresses misreporting (aggregation of complaints from multiple interactions)
  – Method addresses manipulation of reports (replicas)
  – Efficient management of reputation data (structured overlay networks)
  – Somewhat reliable detection of misbehavior

Page 19

Open Issues

• What are the assumptions on peer behavior?

• What is the meaning of being evaluated positively?

• Is the available reputation information used optimally?

• What about collusion?

• Review the foundations!

Page 20

3. Trust and Trust Models

• Definition of Trust: Peer a (trustor) trusts peer b (trustee) if through some interaction …
  – Gain is to be shared with peer b
  – Peer a is exposed to a risk of loss

• Trust models
  – Reduce the opportunity of the trustee to exploit the situation
  – Reduce the vulnerability of the trustor, and
  – Help the trustor to decide if and when to enter an interaction

Page 21

Reputation-based Trust Management

• Problem: Trust management based on peers' reputations

• Define:
  – A form of feedback to be taken from interacting peers about their partners' trustworthiness (set W)
  – A strategy to aggregate the available feedback (algorithm TA)

• Output:
  – An estimate of the trustworthiness of a peer (set T) so that peers can decide whether to enter into an interaction

Page 22

Example

• Multiple interaction contexts: d = direct, r = report
• W = {r, d} × [0,1], T = [0,1]
• Example: interactions among peer a and peer b
  – (r, 0.8): a thinks b reported with quality 0.8
  – (d, 1.0): a thinks b performed with quality 1.0

[Figure: trust multigraph over peers i, u, v, j with edge labels (r, v1), (r, v2), (d, 0.9), …, (r, vk−2), (r, vk−1), (d, vk)]

Page 23

Example (contd.)

• i evaluates the trustworthiness of j using experiences made by u and v
  – direct experiences of u and v filtered out using recommendations along all paths (algorithm TA)

[Figure: the same trust multigraph; i aggregates feedback about j along three paths]

Path 1: f1 = f(v1, v2, …, vk)
Path 2: f2 = f(…)
Path 3: f3 = f(…)
tj = t(f1, f2, f3)

Page 24

P2P Reputation System

• Definition: P2P Reputation System
• A P2P reputation system is a quadruple (G, W, TA, T):
  – G is a directed weighted multigraph (P, V)
  – P is a set of peers and V the set of edges, with weights drawn from the set W of possible interactions
  – TA is an algorithm that operates on the graph and outputs a specific value t ∈ T for any peer

Page 25

Trust Multigraph vs. P2P Overlay Network

• Trust multigraph != P2P overlay network
  – Another network on top of the P2P overlay network
  – Data items stored in the underlying P2P network

• Unstructured P2P overlay:
  – Each peer stores its own outgoing edges
  – (destination, time) as the key

• Structured P2P overlay:
  – Stored at peers as dictated by the overlay network
  – (destination, source, time) as the key
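A small sketch of the two keying schemes listed above; the in-memory dictionaries stand in for a peer's local store and for a DHT put operation, and only the key layouts come from the slide.

    # Sketch: storing trust-multigraph edges in the underlying P2P network.
    import time

    def store_unstructured(local_store, destination, weight):
        """Unstructured overlay: the edge stays at its source, keyed by (destination, time)."""
        local_store[(destination, time.time())] = weight

    def store_structured(dht_put, source, destination, weight):
        """Structured overlay: the edge is stored where the overlay dictates,
        keyed by (destination, source, time)."""
        dht_put((destination, source, time.time()), weight)

    # In-memory stand-ins for the overlay:
    local_store, dht = {}, {}
    store_unstructured(local_store, destination="peer_b", weight=("d", 0.9))
    store_structured(dht.__setitem__, source="peer_a", destination="peer_b", weight=("r", 0.8))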

Page 26

Assumed Peer Behavior

• Rational
  – Economic model of the underlying interactions
  – Utility maximization

• Probabilistic
  – Joint probability distribution on the behavior of peers P:
    P[…, pk performs wk to pj, …, pl reports wl when pm performs wm, …]
  – Marginal P[pi performs wk to pj] being estimated by pj

Page 27

Peer Behavior Examples

• Rational
  – Game as the underlying interaction:

                        Purchase      No Purchase
        High Quality     1, 1          0, 0
        Low Quality     -1, 2          0, 0

  – Repeat this game against many buyers

• Probabilistic
  – Peers p1 and p2 collude against p3:
    P[p1 misreports on p3 | p2 misreports on p3] != P[p1 misreports on p3]
  – Independent cheating and misreporting

Page 28

General Problem of Trust Management

• Given
  – Reputation system defining the type of interactions and feedback
  – Model of peer behavior

• Find a trust algorithm (TA) that limits the vulnerability of trustors and helps to decide when to enter an interaction

Page 29

4. Classification of Trust Management

• Social Network Approaches
• Probabilistic Techniques
• Game-theoretic Approaches

Page 30

Social Networks

• Ad hoc feedback aggregation
  – Direct experiences filtered out by trust along paths

• Example: Xiong and Liu, 2004
  – Average feedback weighted by the feedback sources' trust:

    t_j = ( Σ_{i=1..|incoming(j)|} w_i · t_source(i) ) / ( Σ_{k=1..|incoming(j)|} t_source(k) )

• A popular example: PageRank (Google)
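A minimal sketch of this weighted-average aggregation, iterated PageRank-style until the trust values stabilize; the graph representation, the uniform initial trust and the fixed iteration count are assumptions.

    # Sketch: trust of peer j = average feedback about j, weighted by the trust of the sources.
    def aggregate_trust(incoming, iterations=50):
        """incoming[j] = list of (source, feedback w in [0,1]) edges pointing at j."""
        peers = set(incoming) | {s for edges in incoming.values() for s, _ in edges}
        t = {p: 1.0 for p in peers}                      # start from uniform trust
        for _ in range(iterations):
            new_t = {}
            for j, edges in incoming.items():
                denom = sum(t[s] for s, _ in edges)
                new_t[j] = sum(t[s] * w for s, w in edges) / denom if denom else t[j]
            t.update(new_t)
        return t

    # Example: u and v report on j with feedback 0.9 and 0.4.
    print(aggregate_trust({"j": [("u", 0.9), ("v", 0.4)]}))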

Page 31

Social Networks - Main Properties

• Trust semantics
  – No plausible interpretation of trust values
  – Probabilistic behaviour not made explicit

• Method performance
  – Good performance for large populations of “cheaters”
  – Good performance for a wide range of misbehaviours

• Mainly huge implementation overhead
  – All available information used
  – High communication costs and computation overhead
  – Approximate methods of Xiong and Liu (2004) acceptable

Page 32

Probabilistic Estimation Techniques

• Probabilistic behaviour assumption made explicit
  – Determine a set of parameters to estimate
  – Feedback as a set of samples
  – Bayesian, maximum likelihood estimation, …

• Bayesian estimation (sketched below)
  – Prior distributions of the unknown parameters
  – Posterior updates on each observation

• Maximum likelihood estimation
  – Compute the likelihood function of the sample set
  – Fit the parameter values that maximize it
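A minimal sketch of the Bayesian variant for binary outcomes, keeping a Beta distribution over a peer's probability of behaving honestly; the uniform Beta(1, 1) prior and the class name are assumptions.

    # Sketch: Bayesian estimation with a Beta prior, updated on each binary observation.
    class BetaTrust:
        def __init__(self, a=1.0, b=1.0):
            self.a, self.b = a, b           # pseudo-counts of honest / dishonest outcomes

        def observe(self, honest):
            if honest:
                self.a += 1
            else:
                self.b += 1

        def expected_honesty(self):
            return self.a / (self.a + self.b)   # posterior mean

    t = BetaTrust()
    for outcome in [True, True, False, True]:
        t.observe(outcome)
    print(round(t.expected_honesty(), 2))       # 0.67 after 3 honest and 1 dishonest outcome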

Page 33

Maximum Likelihood Estimation

• Example: Despotovic and Aberer (2004)
  – Binary outcomes of the interactions
  – Innate probabilities of performing honestly and of lying when reporting to others: θk, lk

• Algorithm (sketched below)
  – Probability of report yk from peer k on peer j:

    P[Yk = yk] = (1 − lk)·θj + lk·(1 − θj)   if yk = 1
    P[Yk = yk] = (1 − lk)·(1 − θj) + lk·θj   if yk = 0

  – Determine the lk's by checking reports on own performances
  – Collect all reports y1, y2, …, yn on peer j and select the θj that maximizes

    L(θj) = P[Y1 = y1] · P[Y2 = y2] ⋯ P[Yn = yn]
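A minimal sketch of this maximum likelihood step, assuming reports are encoded as yk = 1 for "performed honestly" and maximizing L(θj) by a simple grid search rather than analytically:

    # Sketch: pick theta_j maximizing L(theta_j) = prod_k P[Y_k = y_k], given the reporters'
    # estimated lying probabilities l_k.
    def report_prob(y, theta, l):
        """P[Y_k = y]: reporter lies with probability l, peer j is honest with prob. theta."""
        p_honest_report = (1 - l) * theta + l * (1 - theta)
        return p_honest_report if y == 1 else 1 - p_honest_report

    def mle_theta(reports, grid_steps=1000):
        """reports: list of (y_k, l_k) pairs; returns the theta in [0, 1] maximizing L(theta)."""
        best_theta, best_like = 0.0, -1.0
        for i in range(grid_steps + 1):
            theta = i / grid_steps
            like = 1.0
            for y, l in reports:
                like *= report_prob(y, theta, l)
            if like > best_like:
                best_theta, best_like = theta, like
        return best_theta

    print(mle_theta([(1, 0.1), (1, 0.2), (0, 0.1), (1, 0.3)]))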

Page 34

Algorithm Operation

[Figure: peers u, v, w hold binary reports (0/1) about i and j in the structured overlay; the evaluating peer issues query(i) and query(j), computes the lk's, and runs MLE]

Page 35

Method Performance

• Various collusion patterns

Page 36

Maximum Likelihood Estimation (2)

• Model: “Normally distributed services”
  – peers provide services with normally distributed quality
  – Means μi and lying probabilities li innate to the peers
  – Liar model: report a random quality drawn from service Ni, i random

• Algorithm (sketched below)
  – Probability (density) of report y from peer k on peer j:

    f_Yk(y) = (1 − lk) · 1/(σ√(2π)) · e^(−(y − θj)²/(2σ²)) + lk · Σ_i p_i · 1/(σ√(2π)) · e^(−(y − μi)²/(2σ²))

  – Determine the lying probabilities by checking reports on own performances
  – Collect all reports y1, y2, …, yn on peer j and select the θj that maximizes

    L(θj) = P[Y1 = y1] · P[Y2 = y2] ⋯ P[Yn = yn]
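A corresponding sketch for the “normally distributed services” model, assuming a common σ, a uniform choice of the liar's service and a grid search over θj; the mixture density follows the reconstruction above and is itself an assumption where the slide is ambiguous.

    # Sketch: reported quality is an honest draw from N(theta_j, sigma^2) or, with the
    # reporter's lying probability l_k, a draw from a randomly picked service N(mu_i, sigma^2).
    import math

    def normal_pdf(y, mu, sigma):
        return math.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    def report_density(y, theta_j, l_k, service_means, sigma=0.1):
        liar = sum(normal_pdf(y, mu, sigma) for mu in service_means) / len(service_means)
        return (1 - l_k) * normal_pdf(y, theta_j, sigma) + l_k * liar

    def mle_theta_gaussian(reports, service_means, grid_steps=200, sigma=0.1):
        """reports: list of (y_k, l_k); maximize the product of densities over a theta grid."""
        best_theta, best_loglike = 0.0, float("-inf")
        for i in range(grid_steps + 1):
            theta = i / grid_steps
            loglike = sum(math.log(report_density(y, theta, l, service_means, sigma))
                          for y, l in reports)
            if loglike > best_loglike:
                best_theta, best_loglike = theta, loglike
        return best_theta

    print(mle_theta_gaussian([(0.8, 0.1), (0.75, 0.2), (0.3, 0.9)], service_means=[0.3, 0.5, 0.8]))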

Page 37

Method Performance

• Independent cheating

Page 38

Probabilistic Estimation – Main Properties

• Trust semantics
  – Probability distributions over the set of possible behaviours are determined
  – Clear decision making – compute utilities

• Method performance
  – Narrow range of misbehaviours (mainly independent acting)

• Low implementation overhead
  – Only a small fraction of the available information used
  – Communication costs to retrieve reports of the witnesses
  – Low storage costs or computation overhead

Page 39

Social Networks vs. Probabilistic Estimation

• Pick two representatives, Xiong and Liu (2004) and Despotovic and Aberer (2004), and compare them in the same settings

• Select an appropriate solution based on the properties of the target environment

• To check:
  – Performance under various:
    • collusive behaviours
    • collusive population sizes
    • trust graph distributions and sizes
  – Implementation overheads

Page 40

MLE Performance – collusive behavior

Page 41

SN Performance – collusive behavior

Page 42

Game-theoretic Techniques

• Rational peer behaviour
  – Economic model of the underlying interactions
  – Utility maximization

• Game-theoretic framework
  – Bayesian games – modeling uncertainties
  – Repeated games – modeling repeated interactions

• Clean decision making – play an equilibrium

• Analytic design
  – Link between feedback aggregation strategies and resulting behaviour

• Open research problem for P2P!

Page 43

Solution Classes

Page 44

5. Applications

• Safe Exchanges
• Peer-to-Peer Auctioning

Page 45

Safe Exchanges

• Problem: exchange divisible goods against money

• An exchange sequence is safe if the expected gain from completing the transaction exceeds the gain from immediate defection
  – Pmax(k): maximal payment in step k s.t. the seller does not defect
  – Pmin(k): minimal payment in step k s.t. the buyer does not defect

[Figure: payment vs. step, showing Pmin(1), Pmin(2), Pmin(3), Pmin'(3) and Pmax(1), Pmax(2), Pmax(3), Pmax(4) = Pmin(4)]

Page 46

Trust-aware Safe Exchanges

• [Sandholm 94] provided a condition and an algorithm to find a safe sequence if it exists: Pmin(k+1) ≤ Pmax(k), (1 ≤ k ≤ n)

• Problem: defection in the last step is always advantageous

• Use trust management to fix this [Despotovic, Aberer 02]
  – in addition, more flexibility in scheduling

[Figure: payment vs. step with Pmin(1), Pmin(2), Pmin(3) and Pmax(1), Pmax(2), Pmax(3); the remaining gap in the last step is covered by trust]
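A tiny sketch that checks Sandholm's condition for a proposed schedule of per-step bounds; the list representation (0-based indices for steps 1..n) is an assumption.

    # Sketch: a sequence is safe if Pmin(k+1) <= Pmax(k) for every step 1 <= k < n.
    def is_safe(pmin, pmax):
        return all(pmin[k + 1] <= pmax[k] for k in range(len(pmax) - 1))

    print(is_safe(pmin=[1, 2, 4], pmax=[3, 5, 6]))   # True: 2 <= 3 and 4 <= 5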

Page 47

Double Auctioning in P2P

• Problem

• How to determine the competitive equilibrium (CE) price? No one knows the curves!

• Efficient trading: continuous double auctioning

Page 48

Application: Double Auctioning in P2P

• Continuous double auction (CDA) in P2P
  – Peers may be tempted to reply to offers and bids not by taking the observed prices but by offering new, better ones. This cannot be observed!
  – How to clear the market in the case of bargaining?

• Open Electronic Bargaining System (OEBS)
  – Bids can be submitted asynchronously at any point in time
  – Any observed bid can be replied to with a counteroffer or ignored
  – If the counteroffer is accepted by the bid originator, a deal is made between the two involved parties and the trading price is set to the midpoint between the two offered values
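A tiny sketch of the OEBS clearing rule described above; only the midpoint rule comes from the slide, the surrounding bargaining bookkeeping is omitted.

    # Sketch: when a counteroffer is accepted by the bid originator, the trading price
    # is the midpoint of the two offered values.
    def deal_price(original_bid, counteroffer):
        return (original_bid + counteroffer) / 2.0

    print(deal_price(10.0, 14.0))   # a bid of 10 countered with 14 and accepted trades at 12.0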

Page 49

Performance of OEBS

• The bidders use a machine learning strategy (a modified “Zero Intelligence Plus”)
• No malicious bidders
• Evaluation of market inefficiency

Page 50

Performance of OEBS – with Misbehavior

• Malicious bidders take multiple offers and keep only the best one
  – Without a reputation system:
  – With a reputation system:

Page 51

6. Conclusions

• Reputation management helps build trust

• Needed in many application areas
  – e.g. Web services, pervasive services

• Select an appropriate solution based on
  – Expected behavioural patterns in the community
  – Target application (e.g. file download)
  – Important properties of the environment (e.g. network size)

• Growing body of work targeting specifically P2P
  – issues of scale, autonomy, heterogeneity, …

Page 52

Research Issues

• Investigating more robust probabilistic models
  – Merging good properties of social networks and probabilistic models

• Advance in P2P game-theoretic reputation models
  – Feedback aggregation as the strategy space constituent

• Dynamic changes of behaviour

• Lower bounds on achievable quality