2011 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Seattle, WA, USA, 21-25 March 2011


Analyzing the Incentives in Community-based Security Systems

Pern Hui Chia
Centre for Quantifiable Quality of Service in Communication Systems (Q2S)
Norwegian University of Science and Technology (NTNU)
chia@q2s.ntnu.no

Abstract—Apart from mechanisms to make crowd-sourcing secure, the reliability of a collaborative system is dependent on the economic incentives of its potential contributors. We study several factors related to the incentives in a community-based security system, including the expectation on the social influence and the contagion effect of generosity. We also investigate the effects of organizing community members differently in a complete, random and scale-free structure. Our simulation results show that, without considering any specific incentive schemes, it is not easy to encourage user contribution in a complete-graph community structure (global systems). On the other hand, a moderate level of cooperative behavior can be cultivated when the community members are organized in the random or scale-free structure (social networks).

    Keywords-Incentives, Collaborative Security, Game Theory

I. INTRODUCTION

Despite the popularity of reputation and recommender systems, relying on community effort for security purposes is still a new concept to many. PhishTank [1] and Web Of Trust (WOT) [2] are two of the few systems that employ user inputs to improve web security. PhishTank relies on user reporting and voting against suspected phishes, while WOT collates user ratings on several trust and security aspects of websites. Researchers have looked at the reliability of Community-based Security Systems (CSS). Moore and Clayton [3] argued that as participation in PhishTank follows a power-law distribution, its outputs are particularly susceptible to manipulation by the few highly active contributors.

Indeed, besides system design to prevent manipulation by dishonest users or attackers, the reliability of a CSS is highly dependent on the incentives of its potential contributors. When many users contribute actively, diverse user inputs serve to cancel out the errors made by individuals and to guard against manipulative attempts. On the other hand, when many users do not contribute, the outcomes can be biased towards the judgment of a few, or be manipulated easily. In the reverse manner of the tragedy of the commons [4], which depicts the situation whereby individuals consume a common resource irresponsibly, motivating active participation in a CSS is a problem of public goods provisioning, in which individuals face a cost that discourages them from contributing to the common protection.

In this work, we look at several factors related to the incentives in a CSS. We adapt the normalized total-effort security game in [5][6] to depict the scenario of collaborative security protection, but our model takes into account long-term user considerations using the framework of infinitely repeated games. We first describe the basic model in Section II and extend it with the expectation on social influence in Section III. We study the effects of user dynamics and a possible contagion effect of generosity in Section IV. In Section V, we find that it is easier to encourage a moderate level of user contribution in a random or scale-free community structure than in the complete-graph structure of global systems.

    II. BASIC MODEL & ANALYSIS

Imagine a community-based system that collates evaluation reports from its members on some trust or security aspects of websites. Inputs from all members are important, as they serve to diversify the inputs and cancel out errors made by individuals, in addition to enabling a level of scrutiny of each other's inputs. Without a sufficient level of contribution, a CSS will be deemed unreliable and abandoned by its members. An equally undesired scenario arises when there is a highly skewed participation ratio, such that the system can be completely undermined when the few highly active users become corrupted or stop participating [3].

    A. An Infinitely Repeated Total-effort Security Game

We use an n-player repeated game to model a CSS consisting of N rational members. We first consider a complete-graph community structure, as in global systems like PhishTank and WOT, where the N = n members collaborate at every time (round) t for common protection. Each game round evaluates a different website (target).

We assume that all members value the benefit of protection b equally and have the same cost of contribution c. Assuming also that the inputs from all members are equally important, we formulate the homogeneous utility U_{i,t} received by member i at game round t to be linearly dependent on the ratio of contribution by the n collaborating members, following the notion of normalized total-effort security in [5][6], as follows:

U_{i,t} = \frac{\sum_j a_{j,t}}{n} b - c \, a_{i,t}    (1)

with a_{i,t} denoting the binary action of either {1: contribute, 0: do not contribute} by member i at game round t.

    3rd International Workshop on Security and Social Networking

978-1-61284-937-9/11/$26.00 © 2011 IEEE

When all members contribute, each of them receives a utility of b − c. If no one but member i contributes, his utility is b/n − c. Assume b > c > b/n, such that contributing to a CSS is a case of the n-person prisoner's dilemma. We further assume an infinitely repeated game to depict the expectation by individual members that the system will evaluate infinitely many websites and exist into an unforeseen future. If the system is known to last only for a finite amount of time, not contributing at all game rounds will be the (sub-game perfect) equilibrium strategy.

We consider that individual members rank their infinite payoff streams using the δ-discounted average criterion. The discount factor δ_i characterizes how a player weighs his payoff in the current round compared to future payoffs. In the context of this paper, it can be interpreted as how a member perceives the long-term importance of common protection and his relationship with the n − 1 interacting peers. We assume that δ_i is heterogeneous. A short-sighted member has δ_i → 0, while a player who values the long-term benefit of the CSS, or who cares about the long-term relationship with other members, has δ_i → 1. Letting 0 ≤ δ_i < 1, the δ-discounted average payoff of member i is given by:

(1 - \delta_i) \sum_{t=1}^{\infty} (\delta_i)^{t-1} U_t    (2)
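The δ-discounted average of Eq. (2) can be approximated by truncating the infinite payoff stream; a minimal sketch, where the truncation horizon is an assumption of ours rather than anything from the paper:

```python
def discounted_average(payoffs, delta):
    """Truncated δ-discounted average criterion of Eq. (2):
    (1 - δ) * Σ_{t=1..T} δ^(t-1) * U_t over T = len(payoffs) rounds.
    enumerate() starts at 0, which matches the δ^(t-1) exponent."""
    return (1 - delta) * sum(delta ** t * u for t, u in enumerate(payoffs))
```

A constant stream U_t = u averages back to u (as the truncation horizon grows), which is why the criterion is called an average rather than a sum.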

Analyzing the equilibrium behaviors of the n > 2 community members is more complicated than in a 2-player repeated game. A trivial setting is when all members are assumed to employ an n-player grim trigger strategy, in which each member contributes initially but threatens to stop contributing forever if he realizes that any of his n − 1 peers has not contributed in the previous round. Given this largest threat of punishment, an equilibrium whereby all members always contribute can be achieved if, for all i:

\delta_i > \frac{cn - b}{bn - b}    (3)

A simple relationship between the cost c and benefit b can now be established. If c is close to b, the required δ_i approaches 1 as n increases. This reflects the real-life scenario that user inputs are hard to obtain if the contribution cost is large relative to the benefit of protection.
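This relationship between c, b and n is easy to check numerically; a minimal sketch of condition (3), with illustrative parameter values:

```python
def grim_trigger_threshold(n, b, c):
    """Minimum discount factor from Eq. (3): all-contribute is sustained
    by the n-player grim trigger when delta_i > (c*n - b) / (b*n - b)."""
    return (c * n - b) / (b * n - b)

# For b = 1, c = 0.4: threshold is 0.2 at n = 4 but rises with n,
# and as c approaches b the threshold approaches 1.
```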

A challenge in a repeated n-player game is that one cannot identify and punish those who have not contributed without a centralized monitoring mechanism, which can be expensive to build or can threaten the anonymity of contributors. A player can only work out the contribution ratio of the others in the previous round, r_{t−1}, based on the payoff he receives. The n-player grim trigger strategy inefficiently punishes everyone, making it an unrealistic strategy.

III. THE EXPECTATION ON SOCIAL INFLUENCE

Rather than using the grim trigger strategy, we adapt an idea from [7] such that community members reason about their respective choices of action, depending not only on the past but also on their expectation as to how their actions can influence the choices of others in the subsequent rounds. We construct two simple influence rules, as follows:

Linear influence. A player believes that his contribution will have a positive influence on his peers and increase their contribution ratio by Δ in the subsequent game rounds. Similarly, he expects that their contribution ratio will drop by Δ if he does not contribute. The rule can be written as:

\tilde{r} = \min(\max(r_{t-1} \pm \Delta, 0), 1)    (4)
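The linear rule (4) amounts to a clamped additive shift; a minimal sketch:

```python
def linear_influence(r_prev, delta_r, contribute):
    """Expected peer contribution ratio under the linear rule, Eq. (4):
    r_{t-1} shifted up by Δ when contributing, down by Δ otherwise,
    clamped to the unit interval [0, 1]."""
    shift = delta_r if contribute else -delta_r
    return min(max(r_prev + shift, 0.0), 1.0)
```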

Sigmoid influence. Same as linear influence, except that the contribution ratio in the subsequent rounds is updated following a sigmoid curve. Specifically, a player reasons that his action will have a reduced influence on others when the current contribution level is close to the extremes, 0 or 1:

\tilde{r} = \frac{1}{1 + e^{-w(r_{t-1} \pm \Delta - \frac{1}{2})}}    (5)
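The sigmoid rule (5) can be sketched likewise; the steepness w = 8 here is an arbitrary choice of ours, as no value for w is given in this excerpt:

```python
import math

def sigmoid_influence(r_prev, delta_r, contribute, w=8.0):
    """Expected peer ratio under the sigmoid rule, Eq. (5): the shifted
    ratio is passed through a logistic curve centred at 1/2, so the
    marginal influence of an action shrinks near the extremes 0 and 1."""
    shift = delta_r if contribute else -delta_r
    return 1.0 / (1.0 + math.exp(-w * (r_prev + shift - 0.5)))
```

At the midpoint the rule returns 1/2, while the same Δ applied near the extremes moves the expected ratio far less, which is the stated motivation for this variant.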

With the above expectation, a member i will contribute at time t if:

V_c(r_{t-1}) - V_n(r_{t-1}) + \frac{\delta_i}{1 - \delta_i} \left[ V_c(\tilde{r}) - V_n(\tilde{r}) \right] > 0    (6)

with V_c and V_n denoting the utilities of contributing and not contributing, respectively, based on the last observed contribution level r_{t−1} and the expected future contribution ratio r̃. This assumes that a member believes that if he contributes now, since his action will have a positive influence on the others, he will be better off contributing in the future as well. The same reasoning applies to the case of not contributing. These simple rules (4), (5) and (6) model the bounded rationality of users. Indeed, it is non-trivial to reason about best responses in an interconnected structure. Each member is not aware of the discount factors of the others. One also may not know about his peers' peers and how they perform in their respective games.
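Putting the pieces together, condition (6) can be sketched as follows. This is a simplified reading of ours: V_c and V_n are taken to be Eq. (1) utilities against the peers' contribution ratio, the linear rule (4) supplies the expected future ratios for the two actions, and the b, c, Δ values are illustrative rather than from the paper:

```python
def should_contribute(r_prev, delta_i, delta_r, n, b=1.0, c=0.4):
    """Decision sketch for condition (6): contribute when the immediate
    gain plus the discounted expected future gain is positive."""
    def v(r_peers, contribute):
        # Eq. (1) utility when the n-1 peers contribute at ratio r_peers
        k = r_peers * (n - 1) + (1 if contribute else 0)
        return (k / n) * b - (c if contribute else 0.0)
    # expected future peer ratios under the linear influence rule (4)
    r_up = min(max(r_prev + delta_r, 0.0), 1.0)
    r_down = min(max(r_prev - delta_r, 0.0), 1.0)
    gain_now = v(r_prev, True) - v(r_prev, False)      # = b/n - c < 0
    gain_future = v(r_up, True) - v(r_down, False)
    return gain_now + (delta_i / (1 - delta_i)) * gain_future > 0
```

With b = 1, c = 0.4, n = 4 and Δ = 0.2, the immediate gain is b/n − c = −0.15, so at r_{t−1} = 0.5 only sufficiently patient members (roughly δ_i > 0.5) find contributing worthwhile; with Δ = 0 no one does, matching the one-shot dilemma.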

    A. Simulation Results

We consider several levels of expectation on how much an action can influence the actions of others, as shown in Table I. Note that Δ depicts the expectation that contributing (or not contributing) has a positive (negative) influence on the peers' contribution ratio. The appendix s deno
