IWT Strategische Technologieën voor Welzijn en Welvaart

APES

Anonymity and Privacy in Electronic Services

Deliverable 2 - Requirement study of different applications

FINAL VERSION
11/05/2001

Stefaan Seys
Claudia Diaz
Bart De Win
Vincent Naessens
Caroline Goemans
Joris Claessens
Wim Moreau

Prof. Bart De Decker
Prof. Jos Dumortier
Prof. Bart Preneel

Executive summary

For applications such as electronic voting and electronic payments, anonymity and privacy are strictly necessary. In a democratic society, public elections are held anonymously and citizens have a fundamental right to privacy, for example when buying goods or subscribing to services. However, current technologies such as databases, on-line connections and mobile communications may lead to an increasing erosion of privacy. For the time being, no widespread communication and payment technologies are available that allow on-line shopping without giving away a substantial amount of personal information. Applications such as e-mail, publishing and web browsing are widely accepted even though they offer no anonymity properties.

This document describes the anonymity requirements of a variety of applications in which anonymity and privacy play an important role: (1) anonymous connections, which can be used for all applications; (2) e-mail; (3) web publishing; (4) web browsing; (5) on-line payments; (6) on-line elections and finally (7) on-line auctions.

In this document we provide a general model that can be used to describe the anonymity properties of the applications we studied. Firstly, the notions of anonymity and privacy are briefly set out. Secondly, an abstract and application-independent terminology is derived for the different entities that actively participate in an application. Finally, different types of anonymity (for example one-time anonymity and persistent anonymity) are described.

The remainder of the document describes in detail the anonymity requirements for the selected applications. For each application we start with a short overview of its functionality and the different entities that participate in it, together with a mapping of these entities to the abstract roles described in the model. Secondly, the anonymity-related requirements and properties of the application are described in more detail.

Each application has its own specific requirements and, at first glance, a general solution for all applications seems unlikely. This is the justification for the abstract model: if the entities in different applications map to the same abstract roles in the model, it is likely that the solutions for these applications will be similar. Finally, a short overview is presented of the existing solutions that provide anonymity and privacy for these applications.

The last chapter contains a number of possible legal issues that will be further examined in the deliverable on the legal aspects of anonymity.


Table of Contents

EXECUTIVE SUMMARY
1. APPLICATION OVERVIEW
2. MODEL USED TO DESCRIBE ANONYMITY PROPERTIES
   2.1. Privacy - Anonymity - Identity - Pseudonymity
      2.1.1. Privacy
      2.1.2. Anonymity
      2.1.3. Identity
      2.1.4. Pseudonymity
   2.2. Entities and roles
      2.2.1. Entities
      2.2.2. Roles
   2.3. Anonymity characteristics
      2.3.1. Types
      2.3.2. Degrees
3. ANONYMOUS CONNECTIONS
   3.1. Description of the system
   3.2. Different entities in the system
      3.2.1. Entities involved in the application
      3.2.2. Entities involved in the anonymous communication
      3.2.3. Possible attackers
   3.3. Anonymity requirements/properties
      3.3.1. Requirements related to anonymity
      3.3.2. Other requirements
   3.4. Short overview of existing solutions
4. ANONYMOUS E-MAIL
   4.1. Description of the system
   4.2. Different entities in the system
      4.2.1. Entities involved in the communication
      4.2.2. Entities involved in the anonymous communication
      4.2.3. Possible attackers
   4.3. Anonymity requirements/properties
      4.3.1. Properties unrelated to anonymity
      4.3.2. Requirements related to anonymity
   4.4. Short overview of existing solutions
      4.4.1. Type 0 Remailer: Penet
      4.4.2. Type 1 Remailer: CyberPunk
      4.4.3. Type 2 Remailer: MixMaster
5. ANONYMOUS PUBLISHING
   5.1. Description of the system
   5.2. Different entities in the system
      5.2.1. Entities involved in web publishing
      5.2.2. Entities involved in anonymous web publishing
      5.2.3. Possible attackers
   5.3. Anonymity requirements/properties
      5.3.1. Properties unrelated to anonymity
      5.3.2. Properties related to anonymity
   5.4. Short overview of existing solutions
      5.4.1. Eternity
      5.4.2. Publius
      5.4.3. TAZ Servers
6. ANONYMOUS BROWSING
   6.1. Description of the system
   6.2. Different entities in the system
      6.2.1. Entities involved in the web browsing
      6.2.2. Entities involved in the anonymous web browsing
      6.2.3. Possible attackers
   6.3. Anonymity requirements/properties
      6.3.1. Properties unrelated to anonymity
      6.3.2. Properties related to anonymity
   6.4. Short overview of existing solutions
      6.4.1. LPWA
      6.4.2. Web Mixes
      6.4.3. CROWDS
7. ELECTRONIC PAYMENTS
   7.1. Description of the system
      7.1.1. Introduction
      7.1.2. Different steps in an electronic payment
   7.2. Different entities in the system
      7.2.1. Different parties in any payment system
      7.2.2. Possible attackers
   7.3. Anonymity requirements/properties
      7.3.1. Anonymity related requirements
   7.4. Short overview of some existing solutions
      7.4.1. Blind signatures
      7.4.2. CAFE
      7.4.3. NetCash
      7.4.4. Anonymous accounts
      7.4.5. Pseudonyms
      7.4.6. Gemplus model
      7.4.7. Mix-based electronic payments
      7.4.8. Proton system
      7.4.9. CEPS
      7.4.10. Micropayments
      7.4.11. Summary
8. ELECTRONIC VOTING
   8.1. Description of the system
      8.1.1. Introduction
      8.1.2. Voting phases
   8.2. Different roles/entities in the system
      8.2.1. Attackers
   8.3. Anonymity requirements/properties
      8.3.1. Anonymity related requirements
      8.3.2. Other requirements
      8.3.3. Requirements of the implementation
   8.4. Short overview of existing systems
      8.4.1. SafeVote
      8.4.2. VoteHere
      8.4.3. CyberVote
9. ELECTRONIC AUCTIONS
   9.1. Description of the system
      9.1.1. Different auction properties
      9.1.2. Auction types
      9.1.3. Different steps in an auction process
   9.2. Different entities in the system
   9.3. Anonymity requirements/properties
      9.3.1. Requirements unrelated to anonymity
      9.3.2. Anonymity related requirements
   9.4. Short overview of existing solutions
10. LEGAL ISSUES
   10.1. Introduction
   10.2. Possible legal issues
      10.2.1. General information requirement
      10.2.2. Specific transparency requirements for commercial communications
      10.2.3. Pre-contractual information requirement
      10.2.4. Liability of on-line service providers
11. FUTURE WORK


1. Application overview

As discussed in the project proposal, the first task and the goal of this deliverable is to define a number of application areas and to analyze the state of the art in these areas. In this scope we have chosen to focus on applications that are popular and/or have interesting anonymity characteristics. The result of this selection phase is the following list of application domains:

• Anonymous connections

• Anonymous e-mail

• Anonymous browsing

• Anonymous publishing

• Electronic payments

• Electronic voting

• Electronic auctions

In the rest of this document we will discuss each of these application domains by first describing their basic functionality. Based on this description we will then elaborate on the different settings by which anonymity is provided. For each of them we will also briefly discuss existing applications.

Other applications that could be interesting, but are currently not part of our research, include software agents, wireless communication, anonymous data retrieval, applications in the mobility area, video-on-demand, etc. They could become the subject of future research if we notice that they provide us with interesting case studies.


2. Model used to describe anonymity properties

In order to describe and compare different applications, strict consensus on a common terminology must be available. In this scope, this section will try to provide clear definitions for several important issues. After first briefly touching upon some general remarks, we will list and define the different entities/roles that could be involved in a typical system. Afterwards, various anonymity properties will be described.

The title of this project is “Anonymity and Privacy in Electronic Services”, which suggests that anonymity and privacy are different concepts. Indeed, the former only concentrates on hiding the identity of someone, while the latter is broader. Privacy concerns all information about the identity, which includes personal information (e.g., his way of living), network traffic, the content of the communication, other statistics, etc. Despite the fact that most anonymity services focus on anonymity, it is important not to neglect privacy concerns.

In the scope of this project it is also important to realize the distinction between an application and the extra service introduced to augment the anonymity/privacy properties of that application. Their dependency relation is strictly one-way: the application as such remains functional without the anonymity service, be it less secure; however, an anonymity service without an application does not serve any purpose. In this report we will often use the term application when we actually mean the combination of both.

2.1. Privacy - Anonymity - Identity - Pseudonymity

As stated in the introductory remarks, it seems necessary to focus on a number of concepts before tackling the application requirements for anonymity services. In this subsection we will briefly consider the concepts of privacy, anonymity, identity and pseudonymity as explored at this stage of the research.

2.1.1. Privacy

The concept of privacy is a broad notion, open to interpretation as it is differently perceived depending on the individual and on the type of society. It also involves competing interests.

The Webster's New Collegiate Dictionary defines privacy as “the quality or state of being apart from company or observation”. The Cambridge International Dictionary of English formulates privacy as “the freedom from unauthorized intrusion or the right to keep personal matters and relationships secret”. Consequently, privacy could be defined as “the interest that individuals have in sustaining a personal space, free from interference by other people and organizations” [47].

Elements of the above-mentioned definitions are summarized in a frequently cited definition given by the American author Thomas Cooley and laid down in a legal document dated 1879, in which he describes privacy as “the right to be left alone”.

Privacy is not only a broad notion, as made clear in the above-mentioned definitions, but can also appear under various dimensions, such as privacy of the person, privacy of personal behavior, privacy of personal communications and privacy of personal data [47]:

• privacy of the person refers to the integrity of the individual’s body;

• privacy of personal behavior relates to the individual's behavior and particularly to someone's socio-cultural attitudes in private and public places;

• privacy of personal communications relates to the interest of individuals to communicate with each other, without routine control of their communications by other persons or organizations;

• privacy of personal data relates to the interest of individuals to restrain other individuals and organizations from processing their personal data. They must at least be able to exercise a substantial degree of control over their data and the use thereof.

“Privacy” in the electronic world

The term ‘information privacy’ is often used to refer to the combination of communication privacy and data privacy, which is most relevant in the context of privacy in the electronic world [116].

The widespread practice of computer databases containing personal information and the emergent use of digital communications and transactions have created new problems relating to ‘information privacy’. The lack of adequate security facilities on the one hand, and the impossibility of preventing unauthorised access to personal information on the other, encourage the use of privacy-safeguarding techniques [47]. A possible technique is to use anonymity.

“Privacy” and “Anonymity”

As will be set out below, anonymity is closely related to the concept of identity [87]. An anonymous person does not reveal his true or real identity. Identity is essential to each person who wants to control personal information and protect his privacy. Consequently, anonymity can be seen as a method of privacy protection.

2.1.2. Anonymity

As the concept of anonymity is, generally speaking, a phenomenon at the intersection of many disciplines [81] and is analysed from a technical and a legal perspective in this research project, we should find, at the present stage of the research, a working definition that can apply to the different approaches of anonymity.

The Webster's New Collegiate Dictionary describes anonymity as ‘the state of being anonymous’. The adjective ‘anonymous’ is explained as ‘not named or identified’ or ‘of unknown authorship or origin’ or ‘lacking individuality, distinction, or recognizability’.

Besides, anonymity is usually defined as the absence of identity [63]. However, we would rather opt to define anonymity as the absence of a true or real identity, given that a false or fictitious identity is, strictly speaking, an identity but also a way to be anonymous, as we will see when we define “pseudonymity”. In the light of the above-mentioned “negative” definition, namely the absence of a true or real identity, we should focus on what constitutes an “identity” [87].

2.1.3. Identity

“Identity” in the physical world

The lexical definition of ‘identity’ formulates the concept as the distinguishing character or personality of an individual. As such, identity is more than a mere name. A specific identity maps to a unique set of characteristics. In the physical world, these characteristics might include, among other things, the unchanging physical characteristics of the person, his preferences, or other people’s perceptions of the individual’s personality [55].

For centuries, two related but nevertheless distinct elements made up the identity of most people. The physical element was based on their presence, on direct interaction in the physical reality of daily life; the other element was informational, based on written records, which identify them as single individuals. Now, focusing on the Internet, new technologies introduce an additional type of identity: ‘digital identity’.

“Identity” in the electronic world

Digital identity can be defined as the digital information that creates the image of an individually identifiable person. Digital identity is the means whereby data are associated with a digital persona, through the collection, storage and analysis of those data about that person [48]. In the context of marketing, the “digital identity” stands for a set of digital information on attributes that represent a person, such as title, role, location, rights and privileges to company resources, personnel information on salary, benefits, etc.

A digital identity can be traced through e-mail addresses, passwords, credit card numbers and Internet Protocol addresses. However, the verification of the identity, or process of authentication, asks for specific methods such as the digital signature.

2.1.4. Pseudonymity

“Pseudonymity” in the physical world

Pseudonymity is defined as the use of a pseudonym, being a fictitious distinguishing mark.

The New Fowler's Modern English Usage dictionary defines the use of a pseudonym as the desire to remain anonymous.

It is important, with regard to the notion of anonymity, to distinguish pseudonyms that can be linked to identifiers such as the legal name and/or locatability (pseudo-anonymity) from pseudonyms that cannot be linked to other forms of identity knowledge (real or full anonymity).

“Pseudonymity” in the electronic world

The use of a pseudonym in the electronic world is put into practice by the use of so-called ‘digital pseudonyms’. These are public keys used to verify signatures made by the anonymous holder of the corresponding private key. A “roster”, or list of pseudonyms, is created by an authority that decides which applications for pseudonyms it accepts, but is unable to trace the pseudonyms in the completed roster. The applications may be sent to the authority anonymously, by untraceable mail, for example, or they may be provided in some other way [38].
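To make the notion of a digital pseudonym concrete, the sketch below treats a pseudonym as nothing more than a verification key. It is a hypothetical illustration, not part of any system discussed in this report; it assumes the third-party Python package `cryptography`, with Ed25519 chosen as an example signature scheme.

```python
# A minimal sketch of a digital pseudonym: the pseudonym is a public
# (verification) key, and only the anonymous holder of the matching private
# key can produce signatures that verify under it.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()   # kept secret by the anonymous holder
pseudonym = private_key.public_key()         # published, e.g. via a roster

message = b"statement made under this pseudonym"
signature = private_key.sign(message)

# Anyone holding the roster can check that the statement was made under the
# same pseudonym, without learning who controls the corresponding private key.
pseudonym.verify(signature, message)         # raises InvalidSignature if forged
```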

“Pseudonymity” and “Anonymity”

It is obvious that pseudonymity is a way to be pseudo- or fully anonymous.

In the case of pseudo-anonymity, the pseudonym is an identifier that links somebody to a transaction or a communication. At first it does not reveal one's identity, but it is indirectly sufficient to associate the transaction or the communication with the particular human being who uses the fictitious name.

In the case of real or full anonymity, the pseudonym cannot be linked to any form of identity knowledge at all.

The different types of anonymity and pseudonymity are further explained in section 2.3.1.

2.2. Entities and roles

For this project, the most interesting applications involve different parties (otherwise it would not make sense to hide an identity in the application). Nowadays, most of these applications are de facto distributed client/server applications, where a client requests a particular service from a server and possibly receives the result of his request. For instance, web browsing involves a client with a browser sending requests to a web server; in a mail system, a client sends a mail message to a particular addressee. Although in the latter case the addressee is not really offering a service, both examples clearly involve two parties that have different functionality or at least act differently.

2.2.1. Entities

To analyze the anonymity properties of an application, one must first identify the different entities or parties that are involved in the execution model of the application. They are the most important threat to the anonymity characteristics of the system and are as such primary players in the risk analysis of the application. For a distributed client/server system, these typically consist of a client, a server and possibly some other underlying services. For a practical case, the enumeration of these entities depends on the specific application. Web browsing, for instance, involves a browser (the client) and a web server (the server).


Orthogonal to the distinction between different entities, some of them might cooperate in order to acquire sensitive information. In this case, we denote them by the keyword collaborating.

2.2.2. Roles

Not every entity in the execution model has the same knowledge about sensitive information. For instance, in an anonymized browsing system the client will evidently always know its own identity. However, the server or an eavesdropper might not have this information. This knowledge depends on the particular task or responsibility of the entity in the application. We use the notion of roles to distinguish between different responsibilities.

In an application, different roles cooperate to reach a common goal. Although the functionality of applications is often totally different, the roles involved are often very similar when viewed on an abstract level. To discuss those roles, we have divided them into three groups: roles related to the functionality of the application, roles related to the anonymity service, and others.

Application related roles

From a functional point of view, the following roles are part of an application providing a specific service:

• initiator: the active part of the application initiating a request to a server

• recipient or responder: the addressee of the request. A recipient is a passive part of the application, while a responder reacts to the request by sending a reply to the initiator.

• sender: any entity that is used to send communication in the application. Normally an initiator will act as or use a sender to transfer his request to the recipient/responder.

• receiver: any entity used to receive communication, normally at the side of the recipient/responder.

Anonymity related roles

To augment the level of anonymity, the following roles could be available in a system:

• anonymized initiator: to set up an anonymized system, the functionality of the initiator might be modified or extended to incorporate the extra service. Therefore, we include the initiator role here again.

• anonymized recipient or responder: analogous

• anonymity provider: any entity that works as part of the anonymity service. Here, an important distinction is based on their knowledge about the user:

• informed: the provider works as part of the anonymity service and thereby acquires information that could compromise the identity of the user

• uninformed: the provider works as part of the anonymity service, but does not have such information

• trustee: a known identity in the system, explicitly authorized to perform certain tasks. For a well-defined subset of these tasks, it is trusted to execute them correctly. This trust normally relies on a contractual agreement explicitly stating its liability in case of abuse. (In many systems, this role is also known as authority; however, since this might be confusing in combination with other legal terminology, we decided to rename this role.)

Other roles

• attacker: someone who tries to break the anonymity characteristics of the system, for instance an eavesdropper. The attacker might have local or global capabilities, meaning that he can listen to only a part of, or to all of, the communication of the anonymized application.

• software/hardware vendor/provider: he is responsible for writing the software or maintaining the hardware. This role is often neglected, but unfortunately it forms a serious threat.

An entity can have different roles in an application. For example, it could be involved in the communication of an e-mail system and at the same time contribute to the anonymity of the system. Hence, for a concrete application one must identify the entities involved in the system and specify for each entity which roles it plays. Moreover, an application typically consists of different phases. For example, electronic payment involves at least a withdrawal and a payment phase. In each of these phases an entity can (and probably will) have different roles. Therefore, specifying entities and roles also requires specifying the respective phase(s) in the application.

2.3. Anonymity characteristics

Based on the proposed entities and their roles we can now define a taxonomy for the types and degrees of anonymity. Hereby, the degree quantifies how strongly the anonymity property is enforced.

As a remark, anonymity properties clearly depend on the specific roles involved in the application. Therefore, any statement about anonymity should always be related to them. In general, the following expression should be used to specify the anonymity characteristics exactly:

role X is <type,degree> towards role Y.

2.3.1. Types

In order to comprehend the different anonymity types, five properties are important:

• identifiable: the possibility to know the real identity of some party in the system by means of the actual data exchanged in the system

• linkable: the feasibility to link successive actions initiated by the same identity as a sequence

• traceable: the possibility to trace communication between application components and thereby acquire privacy information. Where identifiable acquires identity information through the actual data communicated in the system, traceable focuses on the context of the communication to get this information (a distinction comparable to the difference between data anonymity and connection anonymity). Besides the identity, this also concerns all privacy information.

• conditional: the possibility for a trustee to reveal the identity whenever required (e.g. when a judge asks for it)

• durable: denotes a quantification of the strength of the anonymity properties over time (sometimes also referred to as forward secrecy)

The first three properties functionally identify the type of anonymity, while the last two rather describe orthogonal properties. Based on the former, we will now propose a taxonomy of the different types of anonymity.

No   Identifiable   Linkable   Traceable   Remarks
1         √             √           √
2         √             √           -
3         √             -           √       Contradictory
4         √             -           -       Contradictory
5         -             √           √
6         -             √           -
7         -             -           √       Contradictory
8         -             -           -

Table 1: Anonymity property product matrix

In Table 1 we show all possible combinations of the first three properties. As can be seen, three of them are contradictory, since identifiability implies linkability (3, 4) and traceability implies linkability (7); intuitively, these two implications are correct, and it is out of the scope of this document to prove them. The others are useful combinations:

• 1: this combination provides no anonymity whatsoever. The identity involved in the system is known and communication can be traced. This is the weakest form of anonymity.

• 2: this combination provides no anonymity either. However, as opposed to (1), communication is not traceable in this case. This type of anonymity will often only apply to one (or some) specific role(s) in the system. For instance, in the case of (confidential) anonymous connections, the exchanged data might contain identifying information. As such, the addressee will know the identity, but any eavesdropper will not and, by definition, he will not be able to trace the communication. Therefore, the identity remains anonymous towards the latter.

• 5: semi-anonymity is a weak form of anonymity. In this case, the data exchanged in the application does not reveal any identifying information (e.g. because this information is removed from the data or encrypted). However, communication is still traceable and hence the identity can be known, be it in a more difficult way. This type of anonymity can be either conditional or unconditional.

• 6: persistent anonymity or pseudonymity uses a secure anonymous identity (often called a pseudonym) to hide the real identity. Since one of the properties is linkability, the same pseudonym will be used over time. The pseudonyms can be generated and/or managed by a trustee (conditional) or by the client himself (unconditional). The latter case is a good alternative since pseudonymity requires untraceable communication and therefore the identity of the source cannot be revealed. This can be achieved for instance by using anonymous connections (discussed in Section 3).

• 8: one-time anonymity also uses a pseudonym to hide the identity, but in this case a new pseudonym is used for every transaction. This combination is the strongest form of anonymity. Similar to pseudonymity, this can be achieved either conditionally or unconditionally.

Other useful types of anonymity include:

• reply anonymity: this type of anonymity enables replies to an anonymous sender. It could be used in combination with one-time or persistent anonymity. Note that we introduce a new type here instead of interchanging the roles involved in the transaction. The reason for this is that we consider a transaction as a whole, and hence reply communication is different from forward communication.

• destination anonymity: anonymity where the identity of the addressee is hidden. This could occur for instance in a multicast transaction where a message is sent to a group of persons, without knowing the actual group members.

2.3.2. Degrees

To quantify the anonymity properties, we propose the following different degrees (these definitions were already discussed in [182]):

• provably exposed: role Y can not only identify role X, but can also prove the identity of role X to others

• exposed: the identity of role X is completely disclosed to Y, but Y cannot prove it to others

• possible innocence: role X is possibly innocent if, from role Y's point of view, there is a nontrivial probability that the supposed identity of X is wrong

• probable innocence: role X is probably innocent if, from role Y's point of view, the supposed identity of X appears no more likely to be the real identity than not. This is weaker than beyond suspicion in that role Y may have reason to expect that the supposed identity of X is more likely than any other potential identity, but it still appears at least as likely that the supposed identity is wrong

• beyond suspicion: role X's anonymity is beyond suspicion if, even though role Y can see evidence of a message, the identity of role X is no more likely than any other potential identity in the system

• absolute anonymity: role Y cannot see any evidence of role X communicating

Since the distinction between some of these terms is not always very clear, a strict (mathematical) model is definitely necessary. This will be one of the focus points for future work.
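Even without such a strict mathematical model, the expression "role X is <type, degree> towards role Y" can already be captured in a simple data structure. The following is a minimal, hypothetical sketch that only mirrors the taxonomy of sections 2.3.1 and 2.3.2; it is not part of any existing system.

```python
# A minimal sketch of a machine-readable anonymity statement.
from dataclasses import dataclass
from enum import Enum


class AnonymityType(Enum):
    NO_ANONYMITY = "no anonymity"                 # combination 1
    NOT_TRACEABLE = "identified, not traceable"   # combination 2
    SEMI = "semi-anonymity"                       # combination 5
    PERSISTENT = "persistent anonymity"           # combination 6 (pseudonymity)
    ONE_TIME = "one-time anonymity"               # combination 8


class Degree(Enum):
    PROVABLY_EXPOSED = 1
    EXPOSED = 2
    POSSIBLE_INNOCENCE = 3
    PROBABLE_INNOCENCE = 4
    BEYOND_SUSPICION = 5
    ABSOLUTE = 6


@dataclass(frozen=True)
class AnonymityStatement:
    subject: str            # role X
    observer: str           # role Y
    type: AnonymityType
    degree: Degree


# Example: the initiator of an e-mail is one-time anonymous, beyond suspicion,
# towards a local eavesdropper.
statement = AnonymityStatement("initiator", "local eavesdropper",
                               AnonymityType.ONE_TIME, Degree.BEYOND_SUSPICION)
```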


3. Anonymous connections

3.1. Description of the system

For the other applications discussed in this report, anonymity and privacy properties are enforced either by implementing them directly in the application or by using an extra service specially built for that type of application. While this is an easy and efficient approach, it is typically performed ad hoc and it requires redesigning for every new (type of) application.

A different approach is based on the common factor of all distributed applications, i.e. communication. Anonymous connections provide a secure, untraceable communication channel, independent of the application. This is normally achieved by using a chain of intermediate mix servers whose input and output cannot be linked. Each mix forwards the messages to the next mix in the chain. The last mix in the chain forwards the request to the server. Because of the unlinkability property of the mixes, neither the server (provided the data transmitted over the connection does not itself reveal the identity) nor any other external party will know the identity of the sender.

To enable communication to several servers, different chains must exist. Therefore, communication over an anonymous connection (i.e. a particular chain of mixes, see the figure below) first requires a connection setup phase by the client, after which he can send data over this connection. At the end, the client or the server closes this connection.

Because of the application-independent approach of anonymous connections, deploying them in a concrete application often requires the implementation of a special adapter. However, besides this little extra effort, the rest can be reused for various types of applications, which is a major advantage over the other anonymity services.

[Figure: a client and a server communicating over an anonymous connection; an adapter on each side links the application to a chain of mixes.]


3.2. Different entities in the system

3.2.1. Entities involved in the application

Anonymous connections are designed to be independent of the application. It is hence hard to discuss the entities involved in the application and their respective roles. However, since we focus our research on client/server applications, we can at least distinguish two entities. Others might be introduced for a concrete application.

- client application: depending on the type of application, this is the client part of the system, for example a web browser or an e-mail client.

- server application: analogously, this is for example a web server or another e-mail client.

3.2.2. Entities involved in the anonymous communication

- client part: as discussed before, anonymous connections are application independent. Deploying them for a concrete application frequently requires special adapters, which form the actual link between the application and the connection. The main goal of these adapters is to forward communication between the application and the connection and to hide application-specific details (e.g., message format, ports, …). They are normally executed on the same platform as the client application, or at least on a platform to which a trusted communication channel exists. As an example, a client adapter for web browsing could be implemented as a local HTTP proxy receiving requests and using anonymous connections to forward them to the web server (a minimal sketch of such a proxy-style adapter is given after this list). Evidently, the complexity of these adapters depends on the specific requirements of the application.

- server part: similar to the client part, the server application also requires a special adapter to interact with the anonymous connection. The primary task of this adapter is comparable to that of the client adapter. Evidently, since adapters communicate directly with each other through the anonymous connection, they must understand each other and hence use some sort of protocol.

- chain of servers: as discussed previously, the anonymous connection is normally implemented by using a network of anonymity servers whose input and output cannot be linked easily. In this service, they act as providers, possibly informed or uninformed depending on (the implementation of) the particular system.
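The sketch below illustrates the proxy-style client adapter mentioned in the first item of this list. It is a hypothetical illustration only: the function send_over_anonymous_connection() stands in for the real anonymous-connection layer and, to keep the sketch self-contained and runnable, simply fetches the URL directly.

```python
# A minimal sketch of a client-side adapter implemented as a local HTTP proxy.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen


def send_over_anonymous_connection(url: str) -> bytes:
    # Placeholder: a real adapter would hand the request to the first mix of an
    # established anonymous connection instead of fetching the URL directly.
    with urlopen(url) as response:
        return response.read()


class AdapterHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The browser is configured to use 127.0.0.1:8118 as its HTTP proxy,
        # so self.path contains the absolute URL of the requested resource.
        body = send_over_anonymous_connection(self.path)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Hypothetical local port; the browser's proxy setting must match it.
    HTTPServer(("127.0.0.1", 8118), AdapterHandler).serve_forever()
```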

3.2.3. Possible attackers

A first group of attackers are the attackers from outside: local and global eavesdroppers can try to reveal the identities of the communicating parties.

A second group of attackers are those that are part of the anonymous connection system. For instance, a mix server can try to reveal the identity. Depending on the particular implementation of the anonymous connection system, it might or might not acquire this information. However, when some or all of the mix servers collaborate, the probability of successfully discovering the identity will be higher.


3.3. Anonymity requirements/properties

3.3.1. Requirements related to anonymity

The most important anonymity property of anonymous connections is the bi-directional untraceability of communication towards the attackers. This requirement depends on several other requirements, one of which is the unlinkability of the input/output of the mix servers. Untraceability probably also requires encryption of the messages sent over the connection.

Other properties depend on the specific requirements of the application, possibly including non-identifiability and unlinkability.

3.3.2. Other requirements

Various types of applications should be able to use the service. Because they have different constraints, it is particularly important here that the system is generic and flexible enough to allow easy integration with the application. Moreover, the service should not exclude certain types of applications by enforcing constraints that are hard to follow. Therefore, it should allow communication to be bi-directional, real-time, etc.

3.4. Short overview of existing solutions

Many systems offering anonymous connections are based on a building block introduced by Chaum [38], called a mix. The main purpose of a mix is to hide the correspondence between its input and output, for example by reordering the messages or by traffic padding to make sure all messages have equal length. When communication is sent through a network of mixes, it is very hard to trace the exact path followed by the communication. Examples of such systems can be found in [1][62][103][104].

Onion Routing [91][160][192][194] was designed by the Naval Research Laboratory to provide a flexible anonymous communication infrastructure. In this system, which is quite similar to mix-based systems, only the client knows the addressee. To start communicating, he sets up an anonymous connection by building and forwarding a special onion. This onion contains different layers describing a random path to the addressee through various intermediate onion routers. By receiving this onion, each router only learns the previous and the next router in the path. The connection phase ends when the onion reaches its final destination, the addressee. As a result, neither the addressee nor the intermediate onion routers know the identity of the client. Moreover, the intermediate routers do not know the identity of the addressee either. A first version of this system was built and offered on the Internet for some time. Nowadays, the designers are working on a second version of the system.
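To make the layered structure of such an onion concrete, the following is a minimal sketch of building and peeling layers over a fixed route. It is a simplification, not the actual Onion Routing protocol: it assumes the third-party Python package `cryptography` and uses a shared table of per-hop symmetric keys, whereas the real system protects the per-hop key material with each router's public key.

```python
import json
from cryptography.fernet import Fernet

# Hypothetical route through three intermediate routers; the exit hop is last.
route = ["mix-a", "mix-b", "mix-c"]
# Simplification: a shared table of per-hop symmetric keys.
hop_keys = {hop: Fernet.generate_key() for hop in route}


def build_onion(payload: bytes) -> bytes:
    """Wrap the payload in one encryption layer per hop, innermost layer last."""
    onion = payload
    hops_and_next = list(zip(route, route[1:] + ["destination"]))
    for hop, next_hop in reversed(hops_and_next):
        layer = json.dumps({"next": next_hop, "data": onion.decode()}).encode()
        onion = Fernet(hop_keys[hop]).encrypt(layer)
    return onion


def peel(hop: str, onion: bytes) -> tuple[str, bytes]:
    """What a single hop does: remove its layer and learn only the next hop."""
    layer = json.loads(Fernet(hop_keys[hop]).decrypt(onion))
    return layer["next"], layer["data"].encode()


onion = build_onion(b"GET /index.html")
for hop in route:
    next_hop, onion = peel(hop, onion)
    print(f"{hop} forwards the remaining onion to {next_hop}")
print("payload delivered to the destination:", onion)
```

Each hop can only decrypt its own layer, so it learns the previous and the next hop but neither the original sender nor the final payload, which mirrors the property described above.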

Recently, a new protocol for initiator anonymity was proposed, called Hordes [182]. For its anonymous connections, it uses a connection setup phase and forwarding mechanisms similar to, for instance, Onion Routing. However, to send replies it uses a mechanism based on multicast and as such relies on its inherent anonymity properties. As a result, performance is gained over the other techniques.


4. Anonymous e-mail

4.1. Description of the system

Two phases appear in mail systems. In the first phase, a user sends an e-mail to another user. In the second phase, the other user can reply to that e-mail. In anonymous mail systems, the goal is that a user can send a message without revealing his identity. Moreover, the responder must be able to reply to that user without knowing the identity of the originator of the message.

4.2. Different entities in the system

This section is split up into three subsections. A first subsection handles the entities that are part of the e-mail application (without any concern about anonymity aspects). The second subsection discusses the entities that appear when we want to extend the e-mail application with anonymity and privacy aspects. A third subsection handles the entities that can compromise the anonymous e-mail system.

4.2.1. Entities involved in the communication

In general, four entities are important in e-mail applications. Note the similarity between what happens on the client side and what happens on the server side.

- Mail Client on client side: when a particular user types an e-mail message and presses the “send“ button, the Mail Client handles the message by converting it into the desired format and delivering it to the Mail Server on the client side. The converted message consists of the content of the message, the e-mail address of the sender, the e-mail address of the receiver, etc. Thus, the Mail Client is the component of the e-mail system that is closely related to the physical sender of the message. In other words, the Mail Client is the initiator of the message. When the recipient replies to that message, the Mail Client on the client side becomes the recipient.

- Mail Server on client side: the Mail Server of the client is responsible for sending the e-mail into the network towards the Mail Server of the recipient of the message. This Mail Server is the actual sender of the message. It is also the receiver of replies.

- Mail Server on server side: this is the server where the e-mail arrives after it is sent out by the Mail Server of the client. It is in fact the recipient of the message. This Mail Server holds the e-mail until the Mail Client on the server side asks for it. This Mail Server is also the sender of replies.

- Mail Client on server side: the user starts up this process when he wants to handle his e-mail. For instance, he can check his mail by asking which mails have arrived at his Mail Server, and eventually he may reply to a particular mail. When the user just reads his mail, the Mail Client on the server side remains passive: it is the recipient of the messages. When the user wants to reply to a particular message, the Mail Client behaves as a responder to the message; it is then the initiator of the reply.


4.2.2. Entities involved in the anonymous communication

- Mail Client on client side: as discussed in the previous paragraph, the Mail Client initiates the message to the Mail Server. In an anonymous mail environment, the Mail Client functionality can be extended to provide some degree of anonymity and privacy. In this way, the Mail Client also has anonymity functionality. The goal of the anonymity service provided is that the initiator remains anonymous towards the receiver and that the initiator and the recipient cannot be linked during the transfer.

- Central remailer: in an anonymous mail system, the e-mail can be redirected to a central remailer that adds some anonymity functionality to the e-mail and retains enough information to forward the e-mail to the Mail Server on the server side. The central remailer has enough information to discover the initiator and the recipient of the message: it is an informed provider.

- Chain of remailers: another option is to send the e-mail through a chain of remailers. Each remailer in the chain forwards the message to the next remailer in the chain. The initiator of the message is revealed only to the first remailer; the recipient of the e-mail is revealed only to the last remailer in the chain. In that way, the initiator and the recipient of the message cannot be linked to each other at a particular remailer in the chain. In this scheme, each remailer can be seen as an uninformed provider. This is similar to the onion routing model.

Note that not all entities involved in the communication have anonymity functionality. This is normal, because it cannot be assumed that the initiator of an anonymous message has a mail server with anonymity functionality. Likewise, it cannot be assumed that each recipient of anonymous mail has a Mail Server and Mail Client that possess anonymity functionality.

4.2.3. Possible attackers

The possible attackers can be split up into two groups:

- A first group of attackers are the attackers from outside: local and global eavesdroppers can try to reveal the identities of the communicating parties.

- A second group of attackers are those that are part of the anonymous mail system: for instance, a central remailer is compromised if it reveals the anonymity information it possesses; it could, for example, sell this information to interested parties. When adopting the scheme of a chain of remailers, each remailer in the chain does not have enough information to link the initiator and the recipient of the message. But this no longer holds if all the remailers in the chain collaborate: if they exchange their information, the whole anonymous mail system can be compromised.

Note that this second group may have the capability to reveal more anonymity information because they provide some of the anonymity functionality.


4.3. Anonymity requirements/properties

4.3.1. Properties unrelated to anonymity

Bi-directional: one important requirement of e-mail applications is that the recipient of a message must be able to reply to it. This has implications for anonymous mail systems: the initiator of the original message must include enough information so that the responder can reply without learning the initiator's identity.

Not real-time: e-mail is not a real-time application. Anonymous mail systems can benefit from this property to provide anonymity services. For instance, a remailer can hold a message for some time before forwarding it, so that eavesdroppers cannot link incoming messages with outgoing messages. In this way, even global eavesdroppers cannot follow the path from initiator towards recipient.

4.3.2. Requirements related to anonymity

One-time anonymity:

- From initiator towards recipient: if an initiator composes a message, he wants to remain anonymous with respect to the recipient. For example, a patient who consults a doctor does not want to reveal his identity when he asks the doctor a question.

- From initiator towards attackers: attackers must not be able to reveal the identity of the initiator of a message. Moreover, an attacker must not be able to link the sender to the recipient. Returning to the example of the doctor: if an attacker can link the initiator with the doctor, the attacker could infer that something is wrong with the initiator.

Note that an attacker can be an inside or outside attacker.

Persistent anonymity:

- The roles between which persistent anonymity can be applied are the same as for one-time anonymity, but here the initiator uses the same pseudonym for each message. So persistent anonymity can be useful from the initiator towards the recipient and from the initiator towards attackers. For example, a patient who exchanges e-mails about his disease wants to be contacted anonymously; in that case, he uses a pseudonym. One-time anonymity is not applicable here, because there must be a way to contact him persistently: the patient wants to keep the same pseudonym during subsequent consultations.

Reply anonymity:

- From initiator towards recipient and attackers: the responder must be able to reply to an e-mail in a way that keeps the initiator anonymous. Therefore, the initiator must provide extra information so that the responder can reply in an anonymous way.


4.4. Short overview of existing solutions

4.4.1. Type 0 Remailer: Penet

In a type 0 remailer system [214][202], the initiator forwards the message to the central remailer. This remailer erases the identity of the initiator, replaces it with a one-time identity or a pseudonym, and then forwards the message to the recipient. A local eavesdropper can expose the sender simply by intercepting the message before it arrives at the remailer. Moreover, a successful attack on the central remailer compromises the whole system.

4.4.2. Type 1 Remailer: CyberPunk

The initiator of an e-mail builds a nested set of encrypted messages [53]. At each layer, the encrypted message consists of a set of instructions and another encrypted message. Each remailer in the chain first decrypts the message, executes the instructions and forwards the result to the next remailer in the chain; that next remailer only becomes known after decryption of the received message. Because each remailer removes only one layer of encryption, the initiator and recipient are unlinkable from the point of view of a local eavesdropper or of a single remailer in the chain. This system does not provide protection against global eavesdroppers.
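The layered construction can be illustrated with a small sketch. The sketch below is illustrative only and is not the CypherPunk message format: it uses symmetric Fernet keys from the Python cryptography package (an external dependency) instead of the remailers' public keys, and JSON instead of the real instruction syntax. It shows how each hop learns only the next address.

    # Minimal sketch of building and peeling nested encrypted layers for a chain
    # of remailers (symmetric keys for brevity; a real system uses public keys).
    import json
    from cryptography.fernet import Fernet

    def build_onion(payload, recipient, hops):
        """hops: list of (name, Fernet key) in forwarding order; returns the
        ciphertext handed to the first remailer."""
        # Innermost layer: what the last remailer sees after decryption.
        blob = Fernet(hops[-1][1]).encrypt(
            json.dumps({"next": recipient, "body": payload}).encode())
        # Wrap the remaining layers, from the second-to-last hop back to the first.
        for i in range(len(hops) - 2, -1, -1):
            layer = {"next": hops[i + 1][0], "body": blob.decode()}
            blob = Fernet(hops[i][1]).encrypt(json.dumps(layer).encode())
        return blob

    def peel(own_key, blob):
        """One remailer step: decrypt and learn only the next hop and inner blob."""
        layer = json.loads(Fernet(own_key).decrypt(blob))
        return layer["next"], layer["body"].encode()

    hops = [("remailer-%d" % i, Fernet.generate_key()) for i in range(3)]
    blob = build_onion("hello doctor", "recipient@example.org", hops)
    for name, key in hops:                      # each hop sees only the next address
        next_hop, blob = peel(key, blob)
        print(name, "forwards to", next_hop)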

4.4.3. Type 2 Remailer: MixMaster

Type 2 remailer systems [53] consist principally of the same building components as type 1 remailer systems. In addition, they provide techniques to counter attacks by global eavesdroppers: message reordering at each hop, messages of a fixed number of bytes, and rejection of replayed messages.
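To make these three countermeasures concrete, the following sketch shows a toy batching mix (it is not MixMaster itself): it pads packets to a fixed length, refuses replays, and only flushes a randomly reordered batch once a threshold is reached. The packet size and threshold are arbitrary illustrative values.

    # Toy batching mix illustrating fixed-length packets, per-hop reordering and
    # replay rejection (not the actual MixMaster protocol).
    import hashlib, os, random

    PACKET_SIZE = 1024       # hypothetical fixed packet length
    BATCH_THRESHOLD = 5      # flush only when this many packets are buffered

    class BatchingMix:
        def __init__(self):
            self.pool = []
            self.seen = set()                    # digests of packets already accepted

        def accept(self, packet):
            digest = hashlib.sha256(packet).hexdigest()
            if digest in self.seen:
                return []                        # replayed packet: silently dropped
            self.seen.add(digest)
            self.pool.append(packet.ljust(PACKET_SIZE, b"\0")[:PACKET_SIZE])
            if len(self.pool) < BATCH_THRESHOLD:
                return []                        # hold messages to hide arrival order
            batch, self.pool = self.pool, []
            random.shuffle(batch)                # reorder before forwarding
            return batch

    mix = BatchingMix()
    for i in range(BATCH_THRESHOLD):
        out = mix.accept(b"message %d " % i + os.urandom(8))
    print(len(out), "fixed-size packets forwarded in random order")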

[Figure: the Penet central remailer and a chain of remailers (Rem.) between the mail client (M.C.) and the mail server (M.S.).]


5. Anonymous publishing

5.1. Description of the system

An anonymous publishing system provides the possibility for a user to publish documents anonymously. This means that the author of the publication remains anonymous towards the other entities in the network. Two phases can be distinguished in a web publication system. In the first phase, the publisher puts the document on one or more web servers. In the second phase, users ask for the web publication by sending a request to the server.

Anonymous publishing systems can be divided into two groups:

- A first group of systems provides the anonymity service in the publishing phase. The publisher puts his document on one or more web servers without revealing his identity.

- A second group of systems provides the anonymity service in the web browsing phase. The web server where the document is published remains anonymous at retrieval time: the initiator must not be able to derive information about the location of the document from the URL he types to get the right document. Otherwise, the location of the publication is not anonymous towards the initiator of the request.

5.2. Different entities in the system

5.2.1. Entities involved in web publishing

- Web publisher: a web publisher puts a document online on one or more servers. The web publisher is the initiator of the first phase in a web publishing system.

- Client web browse application: when a particular user wants to consult an online document, he types the corresponding URL in his web browser. The web browser handles the request by contacting the web server from which the document can be downloaded. As a result of the request, the client web browser receives the requested document and shows it on the screen. As such, the client web browser is both the initiator and the sender of the request.

- Web server(s): this is/are the web server(s) where the document is published. The web servers are the receivers and recipients of a web publication request. Moreover, the web servers are both the receiver and the responder of a web browse request. Upon request, the web server returns the desired document.

5.2.2. Entities involved in anonymous web publishing

- Publisher: when the publisher puts his documents on another web server, he wants to remain anonymous during this phase. Therefore, the publisher must transform his request into an anonymous one. For instance, he can send the document, included in an anonymous e-mail message, to the publishing web server(s).

- Location provider: a location provider publishes "nested encrypted URLs" together with a specification of which documents can be requested from each of these URLs. These URLs contain enough information to contact the right web server, but in such a format that it is impossible for the initiator to derive the web server from the information. Instead of contacting the web server immediately, the URL is sent to a rewebber in the rewebber network. Because the location provider is nothing more than a public database, it is an uninformed provider.

- Chain of rewebbers: the published nested encrypted URL identifies the first rewebber in a chain of rewebbers, and the request is sent to this rewebber. The rewebber peels off one layer of the nested encrypted URL so that the second rewebber in the chain appears, and the request is sent to that second rewebber. This process continues until the request arrives at the last rewebber in the chain, which forwards it to the web server. So, the last rewebber in the network knows the location of the web server but does not know the initiator of the request. Thus, the rewebbers in the network are uninformed providers.

5.2.3. Possible attackers

Again, we split up the attackers in two groups: the attackers from outside the web publication system and the attackers from inside the web publication system.

- Global eavesdroppers form a threat because they can follow the chain of rewebbers through which the request passes; thus, global eavesdroppers can possibly trace the request from the initiator to the web server. In practice, however, this is very hard to achieve. Another global attack is possible: powerful search engines retrieve information about all documents published all over the network. To counter this attack, an anonymously published document must be stored at the web server in a modified (i.e. encrypted) form. The document appears in plaintext only at the initiator of the request. In that way, search engines do not learn anything about the content of the document at the web server.

- Collaborating rewebbers: if all rewebbers in the chain collaborate, they can trace the path from the initiator of the request towards the web server where the document is published. In that way, collaborating rewebbers can compromise the system.

Remark that a location provider does not hold any security-sensitive information and thus cannot form a threat to the system. Because all its information is publicly available, the location provider cannot be compromised.

5.3. Anonymity requirements/properties

5.3.1. Properties unrelated to anonymity

- Bi-directional: web browse requests are bi-directional because a request is sent from a web browser to a web server and that web server responds to that particular request. The response can occur along the same connection as the original request, because the response happens immediately after the request. So, the web server does not have to possess enough information to contact the initiator afterwards (i.e. the web server does not have to make an anonymous connection to the initiator). This is not true for mail systems, where the responder must have enough information about the initiator of the message to reply to that initiator.

- Real time: in contrast to mail systems, web browse requests are handled in real time. The response happens immediately after the request. So, rewebbers in the network cannot use latency as a security mechanism.

5.3.2. Properties related to anonymity

Persistent anonymity:

- From receiver towards initiator/attackers: anonymous web publishers put a nested encrypted URL on a location provider. This URL is in fact a permanent anonymous address of the web server where the document is published. Every initiator that consults the document uses the same URL (i.e. the URL is persistent). This way, the web server (and the document) remains anonymous towards all other entities in the system.

5.4. Short overview of existing solutions

As discussed before, web publishing consists of two phases. First, the publisher puts his document on a web server. Second, users consult the document. Systems that provide some anonymity service can be split up in two categories. The first category offers mechanisms providing an anonymity service during the publishing phase (f.i. Eternity and Publius). The second category offers mechanisms during the web browsing phase (f.i. TAZ servers).

5.4.1. Eternity

In this system [2], the anonymity lies in the publishing phase. The publisher sends the document to the Eternity server in an anonymous way. The Eternity system replicates the document on a series of servers so that it is practically impossible to remove it from all these servers. Users requesting the posted document cannot retrieve the identity of the publisher. Because of the redundancy, the publisher can be fairly confident that the document remains available online.

[Figure: the publisher anonymously sends the document to an Eternity server, which replicates it on several servers (publishing phase); browsers retrieve it from those servers (web browsing phase).]

5.4.2. Publius

Publius [197] rests on the same fundamentals. It also provides extra mechanisms such that it is not possible for outsiders to update the document; only the publisher can do so. The publisher is responsible for replicating the document and for distributing, over the servers, the key shares needed to decrypt the document.
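As an illustration of the key-splitting idea, the sketch below shows a simple n-out-of-n XOR split of a document key. This is only a rough analogue: Publius itself uses a threshold scheme in which a subset of the shares already suffices to reconstruct the key.

    # Rough analogue of spreading a document key over several servers: a simple
    # n-out-of-n XOR split (Publius uses a threshold scheme instead).
    import os
    from functools import reduce

    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    def split_key(key, n):
        shares = [os.urandom(len(key)) for _ in range(n - 1)]
        shares.append(reduce(xor_bytes, shares, key))   # last share completes the key
        return shares                                   # one share per server

    def combine(shares):
        return reduce(xor_bytes, shares)

    key = os.urandom(32)                                # key that encrypts the document
    shares = split_key(key, 5)
    assert combine(shares) == key                       # all shares together recover it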

5.4.3. TAZ Servers

In this system [89], the publisher puts the document on one web server, which may or may not be his own web server. The anonymity is not in the publishing phase but in the web browsing phase. The TAZ system consists of a TAZ server and a network of rewebbers. The TAZ server provides a public database that maps virtual hostnames to nested encrypted URLs; thus, TAZ servers themselves do not increase the security of the system. Next to virtual hostnames, pseudonyms (for authors) can be added. An initiator sends the nested encrypted URL through a chain of rewebbers. Each intermediate rewebber peels off one layer of the nested encrypted URL. The document itself is also stored encrypted at the web server to prevent a search engine attack.

[Figures: in the TAZ system, the browser sends the request through a chain of rewebbers (rew.) to the server; in Publius, the publisher replicates the document on several servers (publishing phase) and a browser retrieves it (web browsing phase).]


6. Anonymous browsing

6.1. Description of the system

The goal of this type of system is to provide a user with an environment for anonymous web browsing. In contrast to anonymous web publishing, the intention of this type of application is that the initiator of a web browse request remains anonymous.

6.2. Different entities in the system

6.2.1. Entities involved in the web browsing

The entities involved in a web browsing system are similar to those in the previous section (about 'entities involved in anonymous publication'). In fact, the section about anonymous web publication retrieval also uses a web browser to request a publication and a web server where the publication is available. So, we distinguish two entities:

- Client web browse application: the initiator of the request.

- Web server: the receiver of the request.

6.2.2. Entities involved in the anonymous web browsing

- Client web browse application: to provide an anonymous web browse system, the client web browser must be extended with some extra anonymity functionality. Instead of sending a request to the web server immediately, the request is sent to (an)other server(s) so that the client is hidden from the receiver of the request.

- A central server: in an anonymous web browse system, a central server can provide an anonymity service. The web browse request is sent to the central server, which strips the initiator's identity and forwards the request to the receiver; the receiver returns its answer to the central server. This solution closely parallels the single central remailer in anonymous mail systems. The central server is an informed provider because it knows everything about the identity of the initiator of the request.

- Chain of mixes: instead of sending the request to one central server, a chain of mixes can be used through which the request is sent. A mix in the chain only knows the next mix in the chain to contact. As such, no mix in the chain can discover the full path between the initiator of the request and the receiver of the request. The first mix in the chain knows the initiator and the identity of the next mix, but nothing about the receiver of the request. Analogously, the last mix in the chain knows the previous mix in the chain and the receiver of the request, but nothing about the initiator of the request. So, the initiator and the receiver cannot be linked if the mixes do not collaborate. Thus, the mixes in the chain are uninformed providers. This system has some similarities with the "chain of remailers" and the "chain of rewebbers".


- Group of participants: instead of sending a web browse request through a chain of mixes, a request can be sent to another initiator of other web browse requests. This process can be repeated, so that the request passes a chain of randomly chosen initiators before it is actually forwarded to the web server. In that solution, the first initiator in the chain is the real initiator of the request, but the second initiator in the chain does not know whether the previous initiator is the real initiator or not, etc. The web server can only guess that the request came from the last one in the chain; it does not know the real initiator of the request. All initiators register at an authority server.

- An authority server (= trustee): the system of a chain of initiators only works well if no compromised initiators register. Therefore, the registration server acts as an authority server. This type of server is actually uninformed because it does not know anything about the actual web browse requests. It only accepts or rejects a new user and informs the other users upon acceptance.

6.2.3. Possible attackers

Again, the possible attackers are split up into entities outside the system and entities inside the system.

- Outside attackers: when using one central server, a local eavesdropper can try to attack the security of that central server. Global eavesdroppers are needed to attack the other entities in an anonymous web browsing system; as said before, this is very difficult to achieve.

- Inside attackers: every entity that provides some anonymity functionality (except the client web browse application) has the possibility to attack the system in some way. However, the likelihood that it really attacks the system differs from entity to entity. One central server can compromise the system, but if this central server is owned by a recognized institution (f.i. the government), it can be assumed that this institution will not have the intention to compromise the system. A chain of collaborating mixes/initiators can also attack the anonymous web browse system, in the same way that a chain of remailers or a chain of rewebbers can compromise the system. However, there is a difference between the mixes and the initiators. Mixes are machines that are part of the system for an unlimited time; if they are placed by a recognized institution, it is unlikely that the mixes really collaborate. But this is not true for the chain of initiators: everyone can enter the group of initiators. That is why there is a need for some kind of authority server in that scheme.

Note that an authority server can also form a threat when it creates and accepts compromised initiators.

6.3. Anonymity requirements/properties

6.3.1. Properties unrelated to anonymity

The requirements of a web browse application without anonymity are described in the section on web publication retrieval: real time and bi-directional. Remark that mechanisms used to implement anonymity services must take these requirements into account.

6.3.2. Properties related to anonymity

One time anonymity:

- From initiator towards receiver/attackers: the intention of an anonymous web browse system is that the initiator remains anonymous during the web browsing process. The initiator must not be identifiable and must not be linkable to the web server that is consulted.

Persistent anonymity:

- From initiator towards receiver/attackers: if the initiator wants to keep the same pseudonym during several web browse sessions, persistent anonymity is required.

Note that these types of anonymity have some parallelism with anonymous mail systems: in anonymous mail systems, the intention is also that the initiator of a mail remains anonymous towards all other parties (recipient and attackers). In contrast to anonymous mail systems, however, reply anonymity comes for free in an anonymous web browse system: the answer to the web browse request is returned along the same connection as the request itself, so the web server does not have to know how to reply to the initiator in the future.

6.4. Short overview of existing solutions

6.4.1. LPWA

The LPWA web browse system [213] consists of one central server that maps user names, passwords and e-mail addresses to corresponding pseudonyms. Thus, the receiver of a request does not know the identity of the initiator. A local eavesdropper between the initiator and the LPWA server can link the initiator and the receiver of a web browse request. Two variants try to tackle this problem: LPWA proxies and LPWA servers behind a firewall.

[Figure: the browser reaches the web server via the central LPWA server.]
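To illustrate how such a central server can hand out consistent pseudonyms, here is a rough sketch (not LPWA's actual construction) in which per-site aliases are derived from a long-term user secret with an HMAC; the relay domain and the field names are made up for the example.

    # Rough illustration of deriving stable per-site pseudonyms (username,
    # password, e-mail alias) from a user secret, so the same pseudonym
    # reappears on every visit to the same site.  Not LPWA's actual function.
    import hashlib, hmac

    def pseudonym(user_secret, site, field):
        tag = hmac.new(user_secret, ("%s:%s" % (field, site)).encode(), hashlib.sha256)
        return tag.hexdigest()[:12]

    secret = b"long-term secret known only to the user (or the proxy)"
    print("username:", pseudonym(secret, "example.com", "user"))
    print("password:", pseudonym(secret, "example.com", "pass"))
    print("e-mail  :", pseudonym(secret, "example.com", "mail") + "@relay.example")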

6.4.2. Web Mixes

Web mixes [211] consist of a chain of mixes through which a request is sent. They basically use the same technique as onion routing: each web mix peels off one layer of the request. However, this system is built at the application level, in contrast to onion routing.

6.4.3. CROWDS

Crowds [217] consists of a series of jondos that act both as the initiator of requests and as participants of the group, called a crowd. An authority server adds candidates to the crowd or rejects them; it also informs the other crowd members of an added/rejected crowd member. A jondo sends a request to a random crowd member (i.e. a participant) that decides either to submit the request to the receiver or to forward it to another crowd member. In summary: in a first phase, a jondo asks the authority server to be added to the crowd; after that, the jondo can play the role of initiator or of intermediate jondo during web browse requests.
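The forwarding rule can be summarised in a few lines. The sketch below is a simplification of the Crowds idea: each jondo forwards the request to a randomly chosen crowd member with a fixed probability and submits it to the web server otherwise; the forwarding probability used here is an arbitrary illustrative value.

    # Simplified Crowds forwarding rule: forward to a random crowd member with
    # probability P_FORWARD, otherwise submit to the web server.  The server only
    # ever sees the last jondo on the path, never the real initiator.
    import random

    P_FORWARD = 0.75                             # illustrative forwarding probability

    def submit(request, crowd, server):
        path = [random.choice(crowd)]            # initiator first picks a random jondo
        while random.random() < P_FORWARD:       # that jondo flips a biased coin ...
            path.append(random.choice(crowd))    # ... and may forward once more
        return server(request), path             # the last jondo contacts the server

    def web_server(request):
        return "response to %r" % request

    crowd = ["jondo-%d" % i for i in range(10)]
    response, path = submit("GET /page", crowd, web_server)
    print(response, "via", " -> ".join(path))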

[Figures: in Web Mixes, the browser sends the request through a chain of mixes to the web server; in Crowds, jondos first register with the authority (authorization phase), and in the web browsing phase a request hops between jondos before one of them contacts the server.]


7. Electronic payments

7.1. Description of the system

7.1.1. Introduction

Electronic commerce is becoming more and more important with the generalization of the use of the Internet to purchase goods. Anonymity properties must be implemented in these systems to protect users' privacy, just as it is protected in the real world, as well as means to prevent fraud.

In an electronic payment we can identify a buyer (payer), a seller (payee) and a bank that enables the money transfer from the buyer to the seller to pay for the purchased goods.

The anonymity properties discussed in this chapter are required at the application level. Some protection is needed at the communication level to construct a robust system.

7.1.2. Different steps in an electronic payment

An electronic payment can be divided into three basic phases as follows:

• Withdrawal: the payer contacts the bank to take money out of his account. The payer may receive the money as electronic coins or transfer money to another account.

• Payment: the payer contacts the payee and they agree on the exchange of money and goods or services. In case the system uses electronic coins, the payer sends the coins to the payee; other systems make transactions between different accounts.

• Deposit: the payee contacts the bank to transfer the coins to his account (or to make a transaction between two accounts, the destination account not being anonymous).

Some protocols include a registration phase if certificates are being used. In some cases these phases are divided into different steps.

Note that the anonymity properties we want to achieve will only be effective if no entity is able to link a particular withdrawal to a payment. Therefore, we have to focus on the withdrawal and payment phases.

7.2. Different entities in the system

7.2.1. Different parties in any payment system

• The payer, who maps to the initiator in the definitions of the proposed model. It is the user of the system who wants to buy certain products or use particular services offered by the payee.

• The payee, who maps to the responder in the model. It is the entity offering a service using the facilities of the network and the payment system.

• The bank, which is a provider. The bank (or a set of banks) controls the accounts of the payer and the payee; the transactions that take place between the payer and the payee are made through the bank. It is desirable that the bank is an uninformed provider, because the anonymity of the payer must be protected towards the bank.

The described roles apply to the payment phase. Note that during withdrawal and deposit the payee and the bank take different roles (the bank behaves as responder in these other two phases, and the payee is the initiator during deposit).

Some payment systems involve more parties, making use of trusted parties who have the ability to disclose information about the identity of the payer under special circumstances. The details will be discussed in the description of each particular system; as a general remark, these other entities are (or constitute as a set) a trustee, i.e. a party that has the information needed to link the payer to the payment. It is important to notice that, for security reasons, this trustee must be capable only of anonymity revocation and cannot play any other role in the system. In particular, the trustee must be unable to forge money.

There are different approaches to achieve anonymity revocation by the trustee [26]:

• The trustee is involved in every withdrawal. In this case the trustee performs the blinding operation on behalf of the payer.

• The trustee is involved in the opening of accounts. Such a system is more efficient than the previous one, because normally an account is used for more than a single transaction.

• The trustee is not involved in any protocol of the payment system but is needed only for anonymity revocation. In such systems the customer proves to the bank, in the withdrawal protocol, that the coin and the exchanged messages contain information, encrypted with the public key of a trustee, that allows anonymity revocation.

7.2.2. Possible attackers

The attackers can be split into two groups, outsiders and insiders.

• Outsiders will be called eavesdroppers, and they will listen to communication. These eavesdroppers can be local at any point or global.

• Insiders are entities that are part of the system and try to use the information or privileges they have to obtain more information. An insider can be a malicious payer, a malicious payee, or a compromised provider of any kind (bank, trustee, certification authority, or any other entity that has information about the payment).

7.3. Anonymity requirements/properties

7.3.1. Anonymity related requirements:

• Payer anonymity: the identity of the payer should remain anonymous; the payer must not be identifiable.

• Payer untraceability: Inability to trace a payment back to the payer.


• Payment untraceability: Inability to trace a payer to obtain information about the payments he has performed.

• Payment unlinkability: Inability to link different payments as being made by the same payer.

These anonymity properties should be durable, unless some anonymity control is implemented (by means of anonymity revocation) to prevent fraud.

Payer anonymity must be achieved towards the payee (provided by most systems), towards an eavesdropper, towards the bank (many implemented systems do not provide it) and towards any other party involved in the designed system.

The same applies to the described properties of unlinkability and untraceability (to achieve untraceability, anonymous connections should be used). In some systems payer anonymity, payer untraceability and payment untraceability are provided towards the payee, but not payment unlinkability. For each implemented system we will discuss the anonymity properties.

Another interesting issue is the revocability of the anonymity. Electronic payments are an attractive field for fraud. Implementing mechanisms to protect the anonymity of honest users should not make it easier for malicious users to carry out fraud while their identity is protected. Therefore, we need mechanisms to prevent fraud or, at least, to detect it and trace the identity of the dishonest users; this should be achieved without affecting the anonymity of honest users. It is understood that the trustee(s) is the only entity that can perform revocation, and it will only answer a request if there exists sufficient evidence that a transaction is not lawful [26][45].

There are two tracing mechanisms that can be used to provide anonymity control [61]:

• Owner tracing: allows the trustees to disclose the identity of the owner of a specific coin (revoking payer untraceability). When a coin has been double-spent (or used in some other fraudulent way), an owner tracing protocol makes it possible to find out the owner of the coin. This protocol can only be applied 'after the fact', that is, after the crime has been committed.

• Coin tracing: traces the coin(s) originating from a suspicious withdrawal (revoking payment untraceability). With this mechanism trustees can trace the destination of certain coins. This might be useful to prevent blackmailing, the sale of illegal goods, etc. With coin tracing, trustees can trace the activities of a suspect user and trace the coins before they have been spent.

Some requirements for the control of anonymity in e-cash systems are the following [61]:

• Anonymity for legitimate users. The four anonymity requirements previously mentioned should be provided for honest users.

• Revocation upon warrant presentation. In order to prevent fraud, anonymity should be revocable in certain cases, for example with a judge's order. In this case, a trusted party, or a combination of parties, should be able to perform owner tracing or coin tracing.

• Separation of power. The trustee(s) which have the ability to revoke anonymity should not have any power other than tracing; in particular, they should not be able to forge coins or impersonate users.

• No framing. The bank, even in collaboration with the trustee(s) or other parties (payee, etc.), should not be able to impersonate users.

• Selectivity. Revocation must be selective; that is, only the transaction for which a judge's order is given must be de-anonymized. The system must behave as a fully traceable system with respect to this transaction, but remain anonymous for the rest, even for transactions of the same user.

• Efficiency. The added burden for the basic system should be minimal for all involved parties. In particular, trustees must be involved only when revocation is required, and remain off-line otherwise.

• Crime prevention. Anonymity revocation should not (even indirectly) motivate crimes more serious than those it protects against.

7.4. Short overview of some existing solutions

7.4.1. Blind signatures

This technique makes anonymous electronic payments possible as follows: during the withdrawal phase, the payer generates the coins, blinds them with a commuting function known only to himself, and asks the bank to sign them [31]. Later on, the payer applies the inverse of the commuting function and obtains valid coins signed by the bank. The bank does not know the serial numbers of the coins it has signed and thus cannot link the payer to the coins.
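A textbook instance of this idea is Chaum's RSA blind signature, sketched below with toy parameters (no padding or coin encoding, and far-too-small primes): the bank signs a blinded value and the payer unblinds it into a valid signature on the coin's serial number.

    # Textbook RSA blind signature with toy parameters (illustration only).
    import math, secrets

    # Toy bank key: n = p*q with public exponent e and private exponent d.
    p, q = 999983, 1000003
    n = p * q
    e = 65537
    d = pow(e, -1, (p - 1) * (q - 1))

    coin = 123456789                                # serial number chosen by the payer

    # Withdrawal: the payer blinds the coin with a random factor r.
    while True:
        r = secrets.randbelow(n - 2) + 2
        if math.gcd(r, n) == 1:
            break
    blinded = (coin * pow(r, e, n)) % n

    # The bank signs the blinded value without learning the coin's serial number.
    blind_sig = pow(blinded, d, n)

    # The payer unblinds the result and obtains the bank's signature on the coin.
    signature = (blind_sig * pow(r, -1, n)) % n
    assert pow(signature, e, n) == coin % n         # anyone can verify the signature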

To carry out the payment, the payee will ask the bank to validate the signature of the coins, to prevent fraud and double spending. Auditability can be achieved if payers receive a digital receipt from payees.

This scheme provides payer anonymity, payment untraceability, payer untraceability and payment unlinkability towards all parties. In case some coins are stolen, the payer may give the bank information about the coins, so the coins can be traced if they have been spent. If the coins have not yet been spent, the bank can refuse them as valid coins when asked to validate the signature.

There are some variants that allow the bank to be off-line during the payment and still prevent double spending. These schemes are called one-show blind signatures [20], restrictive blind signatures [21] or fair blind signatures [186].

Attacks: if the payer does not use anonymous connections, an eavesdropper, the payee or the bank may disclose his identity or link together withdrawal and payment or different payments.

7.4.2. CAFE

This is an off-line payment system. It is based on the blind signature technique, but here the coins contain information about the identity of the payer that is only revealed in case of double spending [18]. The responsibility for his own anonymity lies with the payer.

Under normal circumstances (honest payer), the system provides payer anonymity, payment untraceability, payer untraceability and payment unlinkability.

Attacks: if the payer does not use anonymous connections, an eavesdropper, the payee or the bank may disclose his identity or link together withdrawal and payment or different payments.

7.4.3. NetCash

In this system [129][136] there are three parties: Currency Servers, merchants (including banks and payees) and clients. The Currency Server knows the serial number of the coins, and the identity of the payer during the withdrawal phase. In this scheme this entity is an informed provider, according to the definitions of the APES model. The system provides payment unlinkability, payer anonymity, payer untraceability and payment untraceability towards the payee and the bank.

The Currency Server is expected not to keep information about the coins. An honest Currency Server will not be able to disclose the identity of the payer, nor to link payments or perform payer or payment tracing.

Attacks: if the payer does not use anonymous connections, an eavesdropper or the payee may disclose his identity or link together withdrawal and payment or different payments. Information about the payer may also be disclosed by a corrupted or dishonest Currency Server.

7.4.4. Anonymous accounts

Anonymous accounts are created to transfer money anonymously. The payer withdraws money from his account using blind signature techniques, and deposits the money in an anonymous account. This anonymous account is used to transfer the money to the payee.

There is a trustee in the system that, in collaboration with the bank, can perform owner or coin tracing. The trustee knows the correspondence between the personal and the anonymous account.

The bank cannot link the withdrawal with the deposit in the anonymous account, so the system provides payer anonymity, payment untraceability and payer untraceability towards the bank and the payee. Payment unlinkability is also provided: different payments cannot be linked together, provided that the payer uses a new anonymous account for each payment.

Anonymity control is provided by the trustee: in collaboration with the bank, it can revoke payer anonymity, payer untraceability and payment untraceability.

Attacks: IP address eavesdropping if no anonymous connections are being used. Dishonest or corrupted collaborating trustee and bank.


7.4.5. Pseudonyms

In this system there is another party involved, a trusted authority (informed third party), that keeps information that relates the identity of a user to his pseudonym. The user uses his pseudonym to perform transactions. The method provides payer anonymity towards the payee and the bank, but not towards the trusted authority.

By using a permanent pseudonym, it is possible for the bank to link together different payments.

Attacks: Corrupted or attacked trustee. IP address eavesdropping if no anonymous connections are being used.

7.4.6. Gemplus model

The system [132] is based on blind signatures. In this model blinding is sub-contracted to a trustee, using identity-linked pseudonyms (PIDs) to achieve anonymity.

The bank acts as a Certification Authority (CA) that provides the user with the certified PIDs (therefore the bank can link the pseudonym to the real identity of the user). The trustee is designed as a Blinding Office (BO); it can link all payments made under the same PID. The user may use the PIDs to pseudo-identify himself to the BO.

Anonymity is achieved by transferring the capability to link a certain coin and the user ID from the CA to the BO.

The implemented anonymity is revocable by collaborating entities (bank and trustee); therefore, the privacy of the payer is vulnerable towards collaborating entities.

Attacks: CA and BO may impersonate the user. Collaborating CA and BO may disclose the identity of the user and trace the payments. A corrupted or attacked BO may release information about payments made under a certain PID.

7.4.7. Mix-based electronic payments

The systems described previously provide data anonymity, but they do not take into account the connection used for communication. If payments are carried out over the Internet, it is possible that an eavesdropper can trace the operations by watching the IP addresses, disclosing information about the identity of the user. This scheme provides privacy using a mix network, instead of blind signatures, protecting the identity of the user towards eavesdroppers.

The system [102] is smart card oriented, and uses an account-based model. The anonymity it implements is revocable if different entities collaborate.

Besides the payer, the payee and the bank, there are two more entities in this system: the certification authority and the transaction center (an informed provider). The transaction center processes transfers between accounts of payers and payees; it knows, for each payer's public key, the identity of the payer's bank, and it is controlled by a conglomerate of banks. The certification authority issues public key certificates.

Collaborating entities (a quorum of bank servers that constitutes the transaction center) can disclose the identity of the user. The same applies to payment unlinkability.


Attacks: Corrupted or attacked transaction center. Dishonest quorum of mixes.

7.4.8. Proton system

The Proton system provides the user with an electronic wallet. The money is stored in a smart card that also contains keys. With it, the user can buy goods from payees who own a special module that is able to communicate with the smart card. Symmetric cryptography is used.

This system does not provide anonymity or unlinkability in any sense towards the bank, which is an informed provider. It provides anonymity of the payer towards the payee. Different payments to the same payee can be linked together by the payee.

Concerning anonymity, this system provides the lowest level of anonymity among the described ones.

7.4.9. CEPS

CEPS stands for Common Electronic Purse Specifications. It is a system that allows interoperability between different smart card systems, internationally and involving different currencies. In order to achieve this goal, public key cryptography replaces symmetric key cryptography for cross-domain security.

The anonymity properties of this system are equivalent to those of the Proton system.

7.4.10. Micropayments

Micropayments are designed to allow users to perform payments involving a very small amount of money (for example, to access non-free web pages) with exceptional efficiency. The security properties of such systems are weaker in order to achieve higher efficiency. There are three types of entities in these systems: payers, payees, and brokers (who are providers). Two of these systems, PayWord and MicroMint, will be examined from the anonymity point of view [165].

The first scheme, PayWord, is a credit-based scheme. The broker knows the identity of the payer, and issues certificates to him that can be used by the payer to perform micropayments. The certificates contain the identity of the user, so the system does not provide any of the anonymity requirements described previously (untraceability and unlinkability). With some modifications, the system could provide payer anonymity and untraceability towards the payee, but never unlinkability, since the efficiency goal is achieved by linking all payments to a particular payee.
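PayWord's efficiency (and the resulting linkability) comes from hash chains of "paywords": each additional cent spent with the same payee is verified with one cheap hash against a previously signed commitment. The sketch below shows only the chain mechanics, leaving out the broker's certificate and the signature on the commitment.

    # Sketch of PayWord-style hash-chain micropayments (certificates and the
    # signed commitment are omitted).  Spending cent i means revealing w_i, and
    # the payee verifies it with i hash applications against the commitment w_0.
    import hashlib, os

    def H(x):
        return hashlib.sha256(x).digest()

    N = 100                                   # chain length = maximum number of cents
    chain = [os.urandom(32)]                  # w_N, chosen secretly by the payer
    for _ in range(N):
        chain.append(H(chain[-1]))            # w_i = H(w_{i+1})
    chain.reverse()                           # now chain[i] == w_i, chain[0] == w_0

    commitment = chain[0]                     # in PayWord this value is signed

    def verify(payword, i):
        x = payword
        for _ in range(i):
            x = H(x)
        return x == commitment

    print(verify(chain[3], 3))                # paying 3 cents: reveal w_3 -> True
    print(verify(os.urandom(32), 3))          # a forged payword fails    -> False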

The second scheme, MicroMint, is coin-oriented. Brokers produce coins that are sold to payers; payers spend these coins and payees return the coins to brokers. Collision-resistance properties of hash functions are exploited to make the task of forging coins difficult. In this scheme, payers are (or can be, depending on the particular implementation) anonymous towards payees, and if they use anonymous web browsing they can be untraceable. In this case different payments are unrelated, so payment unlinkability is also achieved. Anonymity towards the broker depends on how the protocol is implemented: if the broker demands authentication from the user before selling him the coins, then the anonymity requirements are not achieved. However, in the description of the protocol the broker keeps only information about valid coins, and not about the payer who bought them, so we assume that the payer is anonymous.

7.4.11. Summary

System               Anonymity towards the payee   Anonymity towards the bank
Blind signatures     yes                           yes
CAFE                 yes                           yes
NetCash              yes                           yes
Anonymous accounts   yes                           yes, but not towards collaborating bank and trustee
Pseudonyms           yes                           yes
Gemplus              yes                           yes, but not towards collaborating bank and trustee
Mix-based            yes                           yes
Proton               yes                           no
CEPS                 yes                           no
PayWord              no                            no
MicroMint            yes                           yes

System               Unlinkability                                        Untraceability
Blind signatures     yes                                                  yes
CAFE                 yes                                                  yes, unless there is double spending
NetCash              yes                                                  yes
Anonymous accounts   no; payments made with the same anonymous            yes (except towards collaborating
                     account can be linked together                       bank and trustee)
Pseudonyms           no; payments made under the same pseudonym           yes (except towards the trustee)
                     can be linked together
Gemplus              the BO (trustee) can link together all payments      collaborating CA and BO can trace
                     made under the same PID                              the payer and the payment
Mix-based            yes (except towards the trustee)                     yes (except towards the trustee)
Proton               no; the payee can link together the payments         not towards the bank
                     made to him, and the bank can link all payments
CEPS                 no                                                   no
PayWord              no                                                   no
MicroMint            yes                                                  yes

System               Coin/account oriented   Informed entity                        Unconditional anonymity
Blind signatures     coin                    -                                      yes
CAFE                 coin                    -                                      no
NetCash              coin                    Currency Server                        no
Anonymous accounts   account                 trustee                                no
Pseudonyms           account                 trustee                                no
Gemplus              coin                    CA (bank) + BO (trustee)               no
Mix-based            account                 transaction center, quorum of mixes    no
Proton               account                 bank                                   no
CEPS                 account                 bank                                   no
PayWord              account                 broker                                 no
MicroMint            coin                    broker                                 no


8. Electronic voting

8.1. Description of the system

8.1.1. Introduction

Voting through the Internet is an interesting topic that will increase participation by making voting more convenient, especially for people who live far away from their town and for elderly or handicapped people.

There is a strong need for anonymity in such a system. It is not acceptable that any party can trace the vote cast by a voter, or the voter who cast a vote. This anonymity should be durable and unconditional (non-revocable).

Distributed trust techniques must be implemented to emulate traditional paper elections, as well as some means to allow observers to verify the result of the election.

8.1.2. Voting phases

An electronic election can usually be divided into the following phases:

• Registration phase: During this first stage, a certain voting authority creates the electoral roll and publishes it on the network. During a certain complaining period, voters should be able to post objections. After this period, the final version of the electoral roll is published by the voting authority. During this phase, voters should also get the cryptographic material to be used in the voting phase (keys, passwords, ...).

• Voting phase: During the voting phase, voters can cast the desired ballots using the communication facilities of the network. The vote casting procedure may need only a single session with a ballot collection authority (which can be the same authority as the one responsible for the electoral roll). Some voting schemes need more than a single session, and it is not uncommon to have more than one voting authority involved in the voting phase.

• Counting phase: At the end of the voting phase, the ballot collecting authority stops accepting ballots and the counting process is initiated. Finally, the tally is published and made available through the network.

8.2. Different roles/entities in the system

The different roles and parties we can find in an electronic election system are:

• The voter, who maps to the initiator in the described model. He will send a request to the voting server to cast a ballot.

• In a voting system we can find several authorities; in most systems there is at least one verification authority and one separate counting authority.

• There is a responder to the initiator, who will accept the cast ballot. In some systems this is implemented as a bulletin board, which consists of a broadcast channel with memory that can be observed and read by all parties. Each party controls her own section of the board in the sense that she can post messages exclusively to her own section, but she cannot erase or overwrite previously posted messages (a toy model is sketched below).
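The following sketch is a toy model of such a bulletin board (signatures, replication and consistency checks are left out): everybody can read the whole board, but each party can only append to her own section, and nothing is ever erased or overwritten.

    # Toy append-only bulletin board: public read access, per-party sections,
    # no deletion or overwriting of earlier postings.
    from collections import defaultdict

    class BulletinBoard:
        def __init__(self):
            self._sections = defaultdict(list)       # party -> list of postings

        def post(self, party, message):
            self._sections[party].append(message)    # append-only

        def read(self):
            # Anyone (voters, authorities, observers) can read the entire board.
            return {party: list(msgs) for party, msgs in self._sections.items()}

    board = BulletinBoard()
    board.post("voter-42", "encrypted ballot ...")
    board.post("counting-authority", "partial decryption share ...")
    print(board.read())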

8.2.1. Attackers

Attackers to the system may or may not be involved in the voting process. Eavesdroppers are attackers who are not involved in the system and may try to listen to communication at a local point or globally.

Another, more dangerous type of attacker is a compromised authority. For this reason, the trust should be distributed, so that no single corrupted authority is able to change the result of the election or to violate voters' privacy.

8.3. Anonymity requirements/properties

8.3.1. Anonymity related requirements

In voting schemes, privacy means untraceability of the published votes: the protocol must not reveal any information about which vote was cast by which honest user. That is, the voter cannot be traced to the vote, and the vote cannot be traced to the voter.

The two requirements "voter privacy" and "vote secrecy" split this untraceability into two parts:

• Voter anonymity: Given a vote (ballot cast by the user) or any other information obtained during the voting process, the system must not leak information about the identity of the voter. In other words, the voter must not be identifiable, and the voter must not be traceable.

• Vote secrecy: For any voter, the system must keep secret the vote (ballot) the voter has cast. The vote must not be traceable, and for consecutive elections we must also ensure vote unlinkability.

Note that the list of eligible voters is public, as well as the final result of the voting process.

Many implementations and proposed protocols rely on the trust put into the authorities to protect anonymity. These systems do not implement, in fact, unconditional anonymity, because if several entities collaborate, the voter anonymity and the vote secrecy can be broken, and the voter and the vote can be linked together.

For electronic elections the anonymity properties must be unconditional and durable.

8.3.2. Other requirements

Other useful requirements for an electronic election are [218]:

• Correctness: A system is correct if

1. it is impossible to modify a vote once it has been cast;


2. it is impossible to ignore a valid vote during the calculation of the end result.

3. it is impossible to count an invalid vote.

• Fairness: All ballots must stay secret until the voting phase has ended.

• Universal verifiability: Ensures that any party, including a passive observer, can convince herself that the election is correct, i.e., the inability of any number of parties to influence the outcome of an election except by properly voting.

• Eligibility: Only eligible voters can cast a vote.

• Uniqueness: No voter can cast more than one vote.

• Non-coercibility (receipt-freeness): Ensures that no voter will obtain, as a result of an execution of the voting scheme, a receipt that can prove how she voted (to prevent vote-buying and other tactics of voter persuasion). No voter should be able to prove that he voted in a particular way.

• Revisability: A voter can change her vote within a given period of time.

• Provide for null ballots: Allow voters to cast a null vote for individual races or even for the entire ballot as an option (e.g., to counter coercion, to protest against a lack of voting options).

• Allow undervotes: The voter may receive a warning of undervoting. However, such a warning must not be public and must not prevent undervoting.

• Authenticated ballot styles: The ballot style and ballot rotation to be used by each voter must be authenticated and must be provided without any other control structure but that given by the voter authentication process itself.

• No vote duplication: It should be impossible to copy another's vote (even without knowing what the copied vote is).

8.3.3. Requirements of the implementation

• Robustness: Ensures that the system can recover from the faulty behavior of any (reasonably sized) coalition of parties.

• Simplicity

• Technology independence: A voting scheme is technology independent if it can be used on a variety of platforms and operating systems.

• Mobility: A voting scheme is mobile if there are no restrictions on the location from which a voter can cast a ballot.

• Flexibility: A voting scheme is flexible if it allows a variety of ballot question formats, including open-ended questions.

• Convenience: A voting scheme is convenient if it allows voters to cast their ballots quickly, in one session, and with minimal equipment or special skills.

• No interaction between voters: For a large-scale election it is unreasonable to require that the voters should all interact with each other as part of the voting protocol.

• Scalability: A voting scheme is scalable if it allows any number of voters to participate in the election.

• Open review, open code: Allow all source code to be publicly known and verified (open source code, open peer review). The availability and security of the system must not rely on keeping its code or rules secret (which cannot be guaranteed), on limiting access to only a few people (who may collude or commit a confidence breach voluntarily or involuntarily), or on preventing an attacker from observing any number of ballots and protocol messages (which cannot be guaranteed). The system should have zero-knowledge properties (i.e., observation of the system messages does not reveal any information about the system). Only keys must be considered secret.

8.4. Short overview of existing systems

8.4.1. SafeVote

System overview

The anonymity goals of the SafeVote system are:

• voter privacy: the inability to know who the voter is.

• vote secrecy: the inability to know what the vote is.

• election integrity: the inability of any number of parties to influence the outcome of an election except by properly voting.

Safevote's system is based on an n-party technology called Multi-Party™. The strategy behind the Multi-Party technology is:

• use a few proven and simple components;

• allow a large number of different connections of such components;

• define trusted introducers and trusted witnesses based on qualified reliance;

• make every connection have a trusted introducer and a trusted witness;

• define a multi-risk model where risk can be not only "average loss" but also "probability of loss" and/or "value at stake";

• favor multiple, independent communication channels over one "strong" channel;

• define clear evaluation criteria such as voter privacy, vote secrecy, and election integrity;

• put voter privacy as the first criterion.

Global architecture

The machines at the voting office are connected via an effectively unknown and changing IP address, and in turn make connections to six other machines in unknown locations, again with unpublished IP addresses.

The Multi-Party technology uses Digital Vote Certificates™ (DVC) and Electronic Ballots™, which together comprise an end-to-end secure system that provides for fail-safe voter privacy, cryptographically strong vote secrecy, and verifiable election integrity.

DVCs use a "thin client" model, with the main part of the work being performed at the servers. Industry-standard SSL is used to authenticate the server to the client, but other methods can also be employed as desired.

Security

Safevote uses multiple control structures and independent channels of information to increase the reliability and trustworthiness of the network voting system, as well as auditing, vote recounting and verifiability of the election.

The DVC and Electronic Ballot components of the Multi-Party technology allow detailed real-time auditing and post-election auditing by election officials, as well as allowing each voter to verify on the Internet whether their vote was received at the servers, without compromising voter privacy, vote secrecy or election integrity.

The DVCs and their passwords are unknown at the very servers that will authenticate the DVCs. There is no voter authentication file to protect at the servers. Yet, DVCs can uniquely authenticate not only each voter to a server but also the ballot style designated to each voter by the registration service, without identifying the voter and without requiring the registration service to be online. DVCs provide for strong authentication and non-repudiation proofs within a closed-loop distributed control system. This enables an end-to-end security design that begins with voter registration and continues to ballot issuance, voting and tallying; the tally can then be compared with earlier tamperproof entries in the voter registration tables.

8.4.2. VoteHere

System overview

VoteHere Platinum is designed to meet the stringent security, audit and verification standards for public-sector elections. Targeted for certification as an election system in the United States and internationally, VoteHere Platinum provides the highest levels of security and privacy. The system can be configured in two ways: as an alternative to conventional poll-site equipment, or as a system to enable remote voting from the home, office, or anywhere one can access the Internet.

Security, privacy and verifiability standards for public elections are much higher than for private sector, proxy, union, or association elections. VoteHere.net's patent-pending technology ensures that public elections conducted online meet stringent standards for ballot privacy, tabulation accuracy, election/vote verifiability and post-election audits. This is achieved through a complex system that combines high-level encryption for vote transmission, a "distributed trust" multi-authority method of tabulation, and the preservation of ballot confidentiality while allowing for a publishable audit trail.

Online voting for the public sector is still in the trial stages, but interest and demand forthe service are gaining momentum. To date, VoteHere.net has conducted more than 15

Page 49: APES Anonymity and Privacy in Electronic Services · APES Deliverable D2 – Requirement study [FINAL VERSION] 3 Executive summary For applications such as electronic voting and electronic

APES Deliverable D2 – Requirement study [FINAL VERSION] 49

successful online elections in six states and is in the process of certifying its Platinumsystem for public sector elections in the United States. In March 2000, VoteHere filed aseries of patent applications nationally and internationally on its election systemtechnology.

Security

[Extracts from the VoteHere.net website:]

The VoteHere Election System prevents vote tampering, and prevents voters from later denying that they cast a ballot, by using digital signature technology. Your ballot is signed with your digital certificate before it is sent over the Internet to the election data center. At the election data center, this digital signature is checked to ensure that the ballot has not been changed in transit.

The VoteHere Election System prevents anyone from discovering how a voter voted. Every voted ballot is encrypted using 1024-bit cryptography and individual ballots are stored in encrypted format at the Data Center. The VoteHere Election System's patent-pending technology protects ballot privacy by only decrypting and tabulating the election tally as a whole; individual ballots are never decrypted.

The greatest threat to an election is compromise by an insider who has been bought. Elections must be protected against compromise in two areas. First, individual ballots must be protected from tampering; ballots cannot be added, deleted, or changed. Second, the system must be based on distributed trust, where even the vendor cannot act as a trusted authority. The VoteHere Election System utilizes a multi-authority tabulation system which supports distributed trust. This is analogous to how elections are protected today by election officials, party observers, and poll watchers. No individual, not even VoteHere.net personnel, can access the election results.
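The extracts above describe tallying the election "as a whole" while individual ballots stay encrypted, combined with multi-authority distributed trust. The quoted material does not name the cryptosystem, but one standard way to obtain this property is an additively homomorphic scheme such as Paillier: the encrypted ballots are multiplied together, and only that product, which decrypts to the sum of the votes, is ever decrypted (the decryption key can additionally be shared among several authorities). The sketch below illustrates the idea with toy parameters; it is not VoteHere's actual protocol.

```python
# Toy illustration of homomorphic tallying (assumed technique, not VoteHere's
# published protocol): yes/no votes are Paillier-encrypted, the ciphertexts are
# multiplied, and only the aggregate is decrypted. Demo-sized primes, insecure.
import math, random

p, q = 293, 433                                   # real deployments use ~1024-bit primes
n, n2, g = p * q, (p * q) ** 2, p * q + 1
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)           # lcm(p-1, q-1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)                 # precomputed decryption factor

def encrypt(vote):                                # vote is 0 (no) or 1 (yes)
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:                    # r must be invertible mod n
        r = random.randrange(2, n)
    return (pow(g, vote, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

ballots = [encrypt(v) for v in (1, 0, 1, 1, 0, 1)]
aggregate = 1
for c in ballots:                                 # individual ballots are never decrypted
    aggregate = (aggregate * c) % n2
print("tally:", decrypt(aggregate))               # 4, the number of yes votes
```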

Global architecture

After off-line authentication, the registered voter is provided with a digital certificate, typically stored on a floppy disk, to be used when they cast their ballot.

To cast a vote, the user accesses the VoteHere.net website through an Internet-connected computer. Here the user is prompted for personal identification as well as their digital certificate. After authorization, the voter is presented with a personalized ballot.

When the voter casts his vote, the ballot is encrypted, digitally signed and sent to the VoteHere.net secure election center. All encrypted ballots are stored on indelible media. Once the polls close, election authorities and designated observers use cryptographic keys to decrypt only the election tally; the individual ballots remain encrypted.

[Extracts from the VoteHere.net website:]

Voters are authenticated either by poll workers at a poll-site or in the same manner that mail-in or absentee voting is done today - through a voter declaration form submitted to the county prior to the election. Once authenticated, the registered voter is provided with a digital certificate, typically stored on a floppy disk, to be used when they cast their ballot.

When the voter is ready to cast their ballot, they access an Internet-connected computer (either at the poll-site or remotely) and enter the VoteHere.net election website. The voter will be prompted for personal identification as well as their digital certificate. Once the system acknowledges that the voter is eligible to vote, the voter is presented with a personalized ballot containing lists of candidates, initiatives, and referenda specific to their district. Using a mouse or other pointing device, voters make choices by clicking next to candidate names or next to yes or no for ballot measures. At the end of the process, choices are displayed for confirmation and the voter clicks on a "Cast Your Ballot" button.

At this time, the ballot is encrypted, digitally signed and sent to the VoteHere.net secure election center. When the center receives a ballot, it checks the digital signature for verification and then removes the voter's name from the list of eligible voters, thus preventing multiple ballot entries by the same registered voter - one person, one vote. The VoteHere Election System stores all the encrypted votes on indelible media. For each ballot issue or race, the ballot box contains the voter's name followed by the 1024-bit encrypted string of alphanumeric characters indicating their choice. This allows the election to be audited to verify who has voted without disclosing how they voted. Once the polls close, election authorities and designated observers use cryptographic keys to decrypt only the election tally; the individual ballots remain encrypted.
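As a rough sketch of the receive-side steps described in the extract (signature verification, striking the voter from the eligible list, storing the still-encrypted ballot), the fragment below uses Ed25519 signatures from the third-party `cryptography` package. The voter names, key handling and ballot format are hypothetical; this is an illustration, not VoteHere's implementation.

```python
# Minimal sketch (not VoteHere's implementation) of the receive-side checks the
# extract describes: verify the voter's signature, enforce one person, one vote,
# and store the still-encrypted ballot. Voter names and formats are hypothetical.
from cryptography.hazmat.primitives.asymmetric import ed25519

# Registration (off-line in the text): each voter gets a signing key, and the
# election center keeps the matching public keys as the list of eligible voters.
voter_keys = {name: ed25519.Ed25519PrivateKey.generate() for name in ("alice", "bob")}
eligible = {name: key.public_key() for name, key in voter_keys.items()}
ballot_box = []                            # (voter, encrypted ballot); never decrypted here

def cast(voter, encrypted_ballot, signature):
    if voter not in eligible:
        raise ValueError("not eligible or already voted")
    eligible[voter].verify(signature, encrypted_ballot)   # raises if changed in transit
    del eligible[voter]                                    # one person, one vote
    ballot_box.append((voter, encrypted_ballot))

ballot = b"<encrypted choice produced by the voter's client>"
cast("alice", ballot, voter_keys["alice"].sign(ballot))
print(len(ballot_box), "ballot(s) accepted")
```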

Techniques

[Extract from the VoteHere.net website:]

• Physical Data Center Security: The VoteHere Gold system is hosted at a secure facility with controlled, monitored access 24 hours per day, 7 days per week.

• Data Center Infrastructure: The data center utilizes uninterruptible power supplies, the physical space is protected with a non-destructive fire suppression system, and the data center is climate controlled with redundant cooling systems.

• Internet Connectivity Provider: Internet connectivity is provided by a high-end vendor offering premium services for mission-critical applications. The vendor utilizes private network access points to work around Internet congestion and has developed intelligent routing technology to deliver data to and from destinations in a faster, more reliable manner.

• System Architecture: The system is designed for high availability, redundancy, scalability, simplicity, and security. Extraneous functionality has been removed to simplify the system and remove possible security threats. The system does not run on a general Internet connection and is designed at every level to only allow traffic and capabilities necessary for running, managing, and monitoring the election. Before any ballot is written to the database, it is first written to optical media that serves as an original copy of all ballots received.

• Security Monitoring: The VoteHere Gold system utilizes three levels of monitoring. The ISP monitors connectivity and the state of the infrastructure 24x7. Counterpane Internet Security, Inc. provides preeminent service in the area of 24x7 managed security monitoring and provides our second level of security. The VoteHere operations staff provides the final level of monitoring.

• PIN Structure: VoteHere provides 10-digit alphanumeric PINs to ensure the integrity and security of the voting process. A person would have a 1 in 100,000,000,000,000,000 chance of guessing a VoteHere-generated PIN. [A rough keyspace check follows this list.]

• System Logs: The system provides extensive logs of the entire election process, which can be analyzed in many forms. The logs can be used to detect intrusion activity and attempts at election fraud and also serve as a comprehensive audit trail.

• Secure Transfer and Storage of Data: Completed ballots are transmitted over the Internet using Secure Sockets Layer (SSL) encryption. Ballots are received at the VoteHere.net election data center, encrypted using 1024-bit encryption and stored on a secure server. In order to ensure voter privacy, the votes are tallied without ever being decrypted.

• System Monitor: The VoteHere Gold System Monitor provides our customers with unparalleled remote access to real-time, in-depth information related to their online election or survey. The System Monitor can be accessed through a standard web browser from any Windows-based computer and does not require plug-ins or additional software.
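A quick back-of-the-envelope check of the PIN odds quoted above: a 10-symbol PIN drawn from an alphabet of k characters has k^10 equally likely values, so a single guess succeeds with probability 1/k^10. The extract does not state the alphabet size, so the sketch below simply tries a few plausible values.

```python
# Keyspace of a 10-symbol PIN for a few assumed alphabet sizes (the extract does
# not state which alphabet VoteHere uses).
for k in (36, 50, 62):               # digits+lowercase, mixed, digits+both cases
    print(f"alphabet {k:2d}: 1 in {k**10:.1e}")
# alphabet 36: 1 in 3.7e+15
# alphabet 50: 1 in 9.8e+16   <- closest to the quoted "1 in 100,000,000,000,000,000"
# alphabet 62: 1 in 8.4e+17
```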

8.4.3. CyberVote

CyberVote, an innovative cyber voting system for internet terminals and mobile phones, is a research and development (RDT) program funded by the European Commission, with additional funding from the companies and organizations undertaking the work. It is part of the Information Society Technologies (IST) 1999 program for research, technology development and demonstration under the fifth framework program (5th PCRD). It is attached to Key Action 1 "Systems and Services for the Citizens".

System overview

A first goal of the CyberVote project is to design and test voting systems for which the underlying cryptographic protocols fulfil a rich set of security properties. In particular, it is required that the voting system is universally verifiable, which means that any party can verify that the election result actually corresponds to the encrypted votes cast during the election, and furthermore that ballot secrecy is controlled by a set of talliers of any size deemed appropriate. That is, ballot secrecy is not necessarily dependent on a small, fixed number of parties but can be scaled to any desirable number of parties, which we call scalable distributed trust. Hence, among other things, the underlying cryptographic protocols are designed to satisfy the seemingly conflicting requirements of universal verifiability and ballot secrecy.

Clearly, good overall security of the election system is not simply guaranteed by the strength of the underlying protocols, but weaknesses in these protocols can never be compensated for by additional security measures. Any appropriate state-of-the-art security measures will be applied to achieve good overall security of the CyberVote system.

A second goal of the CyberVote project is to extend the platform for voting clients from PCs to other networked devices such as mobile phones and possibly TV set-top boxes. Availability of voting clients on these devices will provide greater convenience to the voters. Some of the challenges are to implement the above-mentioned cryptographic protocols, which require large-integer arithmetic, and to find suitable user interfaces for these constrained devices. An important part of the CyberVote effort will be devoted to these issues. The other projects considered in this deliverable only target PCs.

Finally, a third goal of the CyberVote project is to take legal issues for binding, public elections into account. While private elections allow for considerable freedom, the rules for public elections are generally much more stringent. Also, the goal is to make the CyberVote system compatible with the rules of several countries at the same time, rather than limiting the scope to a single country. For example, compulsory voting may be supported by the CyberVote system, although it is not a requirement in every country. Further, the interaction between the CyberVote project and legislative bodies is supposed to be bi-directional; that is, the CyberVote project tries to match current and emerging requirements for voting systems and, at the same time, tries to assist the development of new legislation pertaining to Internet-based voting systems.


9. Electronic auctions

9.1. Description of the system

The fundamental goal of auction systems is the distribution of resources among a number of bidders. This distribution is based on pre-defined rules to determine the actual buyer(s), the selling price (clearing price), etc.

Different types of auctions are practiced to achieve different business objectives such as best price, guaranteed sale, minimum collusion possibility, etc. In this chapter we first review the different kinds of auctions and their properties. Next we discuss the steps of a complete auction-based trading process [118].

9.1.1. Different auction properties

1. Bid confidentiality

We say the bids are open if the bid amounts are known to all bidders during the auction. In a sealed-bid auction the bid amounts are known only to the bidder until the auction closes. What happens to the bids after the auction is closed is discussed in Section 9.3.

2. Bid cancellation

The rules of the auction can specify if and when buyers can cancel their bids.

3. Auction winner(s)

The rules of the auction should specify the means to determine the winner or winners of the auction. Normally the winner is the buyer with the highest bid amount. The rules should also specify what happens when two or more buyers bid the same, highest, amount.

4. Selling or clearing price

Once the winners are declared, the clearing price has to be set. An auction's clearing price represents the exchange terms agreed upon by the bidders as a result of the auction. Each clearing price is guaranteed to be consistent with the expressed willingness of the matched parties. Since there may in general be many results consistent with the submitted bids, the rules of the auction specify exactly how the clearing price is determined. This price can be the bidding price or lower. In a discriminative auction, the winners pay what they bid. In a non-discriminative auction, all winning bidders pay the price bid by the winning bidder with the lowest bid. Finally, in a second-price auction for a single item, the winner pays the price bid by the second-highest bidder.
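The pricing rules just described translate directly into code. The sketch below is illustrative only (the bidder names and the absence of tie-breaking are assumptions); it computes what winners pay under the discriminative, non-discriminative (uniform) and second-price rules.

```python
# Illustrative only: winners and prices under the three pricing rules described
# above, for `units` identical items.
def clear(bids, units, rule):
    """bids: list of (bidder, amount); returns list of (winner, price paid)."""
    ranked = sorted(bids, key=lambda b: b[1], reverse=True)
    winners = ranked[:units]
    if rule == "discriminative":                  # winners pay what they bid
        return winners
    if rule == "uniform":                         # all pay the lowest winning bid
        lowest_winning = winners[-1][1]
        return [(name, lowest_winning) for name, _ in winners]
    if rule == "second-price" and units == 1:     # Vickrey: pay the second-highest bid
        return [(winners[0][0], ranked[1][1])]
    raise ValueError("unknown rule")

bids = [("A", 120), ("B", 100), ("C", 90), ("D", 80)]
print(clear(bids, 2, "discriminative"))   # [('A', 120), ('B', 100)]
print(clear(bids, 2, "uniform"))          # [('A', 100), ('B', 100)]
print(clear(bids, 1, "second-price"))     # [('A', 100)]
```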

5. Auction closings

Different rules are used to determine when an auction round closes. The most common methods are:

- Expiration time (for open or sealed auctions): the auction closes at a predetermined expiration time.

- Timeout (for open-bid auctions): the auction closes when no bid higher than the current high bid is made within a predetermined timeout interval.

- Combination of expiration time and timeout (for open-bid auctions): the auction closes when there is a timeout after the expiration time.

- Stock-out or price below a pre-specified level (for Dutch auctions, see below): the auction closes when all the goods are sold or when the price falls below a pre-specified level (to protect the sellers).

After the auction closes, the winners are determined and the clearing price(s) are set.

In a multiple-round auction the auction has more than one closing; here the auction ends after the final closing.

9.1.2. Auction types

There are many auction types [118], but we will only discuss the most common ones: the English or open-cry auction, the Dutch (downward-bidding) auction, the sealed-bid auction and the double auction.

1. English auctions

In an English auction, also known as the open-cry or ascending-price auction, buyers bid the highest price they are willing to pay for an item, and bidding activity stops when there is no other higher bid or the auction duration is complete. It is called open-cry because each bidder can hear the bids submitted by rival buyers, so this is an open-bid auction. After a buyer has made his bid, rival bidders have a limited time to respond to it with a higher counter-bid.

The winner is the buyer with the highest bid at the end of the auction. Normally there are two ways an auction can be closed: the auction is closed after a time-out (no counter-bid was placed during the response time) or the auction is closed after a pre-defined duration. The item is sold to the highest bidder at the expressed bid price. Sometimes the auctioneer will set a reserve price (the lowest acceptable price) for an item. In this case, when the reserve price is not met, the item is not sold.
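The two closing rules just mentioned, a pre-defined duration and a timeout after the last bid, can be sketched as follows; the bid format and time unit are assumptions made for illustration.

```python
# Minimal sketch of the two English-auction closing rules described above:
# close after a fixed duration, or close when no counter-bid arrives within a
# timeout. Times are plain numbers (e.g. seconds since the auction opened).
def english_close(bids, duration=None, timeout=None):
    """bids: list of (time, bidder, amount) in time order; returns (winner, price)."""
    high = None
    for t, bidder, amount in bids:
        if duration is not None and t > duration:
            break                                   # bid arrived after the expiration time
        if timeout is not None and high and t - high[0] > timeout:
            break                                   # no counter-bid within the timeout
        if high is None or amount > high[2]:
            high = (t, bidder, amount)
    return (high[1], high[2]) if high else None

bids = [(1, "A", 50), (3, "B", 60), (20, "A", 70)]
print(english_close(bids, duration=30))     # ('A', 70): closes at the expiration time
print(english_close(bids, timeout=10))      # ('B', 60): A's 70 came 17s after B's bid
```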

One variation on the open-outcry auction is the open-exit auction, in which the prices rise continuously, but players must publicly announce that they are dropping out when the price is too high. Once a bidder has dropped out, he or she may not reenter. This variation provides more information about the valuations (common or public) of others than when players can drop out secretly (and sometimes even reenter later).

In another variation, an auctioneer calls out each asking price and bidders lift a paddle to indicate a willingness to pay that amount. Then the auctioneer calls out another price, and so on.

An important property of an English auction is that bidders may continuously re-assess their evaluation and bid again. Sometimes bidders get carried away with enthusiasm and start a bidding competition, bidding higher and higher. The winner then ends up paying a price that is significantly higher than the item's value; this is called the winner's curse. So the key to a successful auction (from the seller's point of view) is the effect of competition on the potential buyers. But with experienced buyers, the seller normally does not receive maximum value for his item, and other auction types may be superior to the English auction for this reason.

2. Dutch auctions

In a Dutch auction, bidding starts at an extremely high price and is progressively lowered until a buyer claims an item by calling "mine", or by pressing a button that stops an automatic clock. At this time the buyer specifies how many goods he wishes to buy at the current price. When multiple units are auctioned, normally more takers press the button as the price declines. In other words, the first winner takes his prize and pays his price, and later winners pay less. When the goods are exhausted, the bidding is over and the auction is closed. The rules of the auctioneer can also specify that the auction closes once the asking price drops below a specified minimum price.
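A toy sketch of the descending-clock mechanics just described, with a hypothetical demand function standing in for buyers pressing the button:

```python
# Toy sketch of a Dutch (descending-clock) auction: the price ticks down until
# buyers stop the clock; the auction closes when the stock runs out or the price
# falls below the seller's floor. All parameters are illustrative assumptions.
def dutch_auction(start_price, floor_price, step, stock, demand_at):
    """demand_at(price) -> units some buyer claims at this price (0 if nobody stops the clock)."""
    sales, price = [], start_price
    while stock > 0 and price >= floor_price:
        units = min(demand_at(price), stock)
        if units:
            sales.append((price, units))          # earlier takers pay more, later ones less
            stock -= units
        price -= step
    return sales

# Hypothetical demand: one unit is claimed at 90 and two more at 70.
print(dutch_auction(100, 50, 10, stock=3,
                    demand_at=lambda p: {90: 1, 70: 2}.get(p, 0)))
# [(90, 1), (70, 2)]
```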

In a descending auction there is no possibility for re-assessment, so a correct evaluation of the market before the start of the auction is important for the buyers.

An advantage (for the seller) is that a buyer with high interest in the item cannot afford to wait too long, so buyers will bid at or very near their evaluation of the item's value. This auction type is preferred by professionals because items are auctioned at prices near their real value, and so it is fair to both the sellers and the buyers.

3. Sealed bid auctions

Obviously the bids in a sealed-bid auction are submitted in sealed envelopes. The buyers are required to submit their bids by a specified deadline. After this bidding round the envelopes are opened and the winners are declared. In a first-price sealed-bid auction, the highest bidder wins and pays the amount he secretly bid.

In a discriminatory auction (more than one item for sale), bids are sorted from high to low, and the items are awarded at the highest bid prices until the supply is exhausted. So winning bidders usually pay different prices for the same goods.

A variation on the first-price sealed-bid auction is the Vickrey-type auction (only a single item for sale). Here the item is sold to the buyer with the highest bid, but he or she purchases the item at the second-highest bid price.

In a multi-round sealed-bid auction there is a deadline for each round of bids, and at that deadline either the auction is closed or the bids from the current round are publicized and a fresh round of bids starts. The rules of the auction have to specify when the auction finally closes (for example, after three rounds).

4. Double auctions

In this auction both sellers and buyers submit bids, which are then ranked from highest to lowest to generate demand and supply profiles. From the profiles, the maximum quantity exchanged can be determined by matching selling offers (starting with the lowest price and moving up) with demand bids (starting with the highest price and moving down). This format allows buyers to make offers and sellers to accept those offers at any particular moment.

In a continuous double auction bids are matched in the order that they are received (either from a seller or from a bidder). For every new bid the auctioneer checks whether the offered price matches the lowest existing sell bid, and vice versa. On successful matching the item is sold at the matching price.
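The matching step of a continuous double auction can be sketched with two priority queues. The convention that a match trades at the price of the standing order is an assumption made here; the text leaves the exact matching price to the rules of the auction.

```python
# Order-matching sketch for a continuous double auction: each incoming bid is
# checked against the best standing offer on the other side and trades if they
# cross. Prices and the matching-price convention are illustrative assumptions.
import heapq

buy, sell, trades = [], [], []        # buy: max-heap (negated prices), sell: min-heap

def submit(side, price):
    if side == "buy":
        if sell and sell[0] <= price:                 # crosses the lowest standing ask
            trades.append(heapq.heappop(sell))        # trade at the standing sell price
        else:
            heapq.heappush(buy, -price)
    else:
        if buy and -buy[0] >= price:                  # crosses the highest standing bid
            trades.append(-heapq.heappop(buy))        # trade at the standing buy price
        else:
            heapq.heappush(sell, price)

for side, price in [("sell", 105), ("buy", 100), ("sell", 98), ("buy", 106)]:
    submit(side, price)
print(trades)    # [100, 105]
```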

One interesting variation is the Double Dutch auction. Here the buyer price clock starts ticking at a very high price and continues downward. At some point the buyer stops the clock and bids on the unit at a price favorable to him. At this point a seller clock starts upward from a very low price and continues to ascend until stopped by a seller, who then offers a unit at that price. Then the buyer clock resumes in a downward direction. The trading period is over when the two prices cross, and at that point all purchases are made at the crossover point.

9.1.3. Different steps in an auction process

1. Initial buyer and seller registration

This step deals with the authentication of the trading parties, the exchange of cryptographic keys, etc.

2. Setting up a particular auction event

In this step the item being sold is described and the rules of the auction are negotiated and defined. These rules explain the type of auction (open cry, sealed bid, …) and other parameters like terms of payment, starting date and time, etc.

3. Scheduling and advertising

This step schedules the auction event and advertises it to attract potential buyers.

4. Bidding

This step handles the collection of bids and the implementation of the bid control rules of the auction (minimum bid, bid increment, deposits required with bids, …) and, for open-cry auctions, the notification of the participants when new high bids are submitted.

5. Evaluation of bids and closing the auction

Implementation of auction closing rules and notification of the winners.

6. Trade settlement

This final step handles the payment to the seller, the transfer of the goods to the buyer, the payment of fees to the auctioneer and other agents, etc.

9.2. Different entities in the system

In the different auction systems discussed above, a number of parties are involved:

1) Traders:


- Buyers (initiator): Place bids in order to purchase the item for sale.

- Seller (initiator/inactive): Has one or more items he/she wishes to sell; in a double auction the seller also places (sell) bids. If the seller does not place any bids, then the seller is not considered an entity that participates in the system (the seller gives the goods to the auctioneer and the auctioneer participates in the system).

- Auctioneer (provider/responder): Arranges the auction and normally should enforce the specific rules of the auction. A trusted third party can be involved to ensure that the auctioneer is "fair" towards the buyers and the sellers. Depending on the actual system this provider can be informed or uninformed. The auctioneer also has the role of responder, given that it reacts to the actions of the buyers (and of the seller if needed).

2) Trusted third party (authority)

From the above description of the different systems, we can see that both buyers and sellers can be actively involved in the auction process. Therefore the auctioneer (who has no interest in the outcome of the auction process) is trusted to make sure that the auction rules are obeyed. Sometimes it is necessary to enforce the "fairness" of the auctioneer (for example, using a third party that can take legal action if necessary).

3) Product(s)

These are the items for sale. In a reverse auction the roles are inverted: there is one buyer, for example a company that wants to have something built, and a number of sellers, entrepreneurs who bid not only a price but an entire set of terms. Here the product is the service of the entrepreneurs.

9.3. Anonymity requirements/properties

In order to conduct a fair auction a number of requirements have to be met. In the "real world" some of these requirements are easily met because buyers and sellers are physically present at the auction and can check whether the rules are obeyed by the other parties. In the case of online auctions, (cryptographic) techniques will have to be used to enforce these requirements. In this chapter we will give an overview of the different requirements that have to be met.

9.3.1. Requirements unrelated to anonymity

1) Bidder Integrity

Only authorized buyers can submit a bid. In online auction systems bidders will have to prove some credentials (possibly in an anonymous way) to the auctioneer in order to be admitted to the auction process.

2) Bid Integrity

A submitted bid cannot be modified. This means that appropriate measures have to be taken to protect the integrity of the digital bids used in online auctions.

3) Bid confidentiality


Applies only to sealed-bid auctions. This requirement is relatively easy to meet using appropriate existing cryptographic techniques to encrypt (seal) the bids.
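The text does not prescribe a particular sealing technique. One simple construction that provides confidentiality until the opening phase, and at the same time the bid integrity of requirement 2, is a hash commitment that the bidder opens only after the auction closes, as sketched below.

```python
# Minimal sketch (one common approach, not prescribed by the text): the bidder
# submits a hash commitment before the deadline and reveals bid+nonce afterwards,
# so the bid can neither be read early nor modified after submission.
import hashlib, os

def commit(bid_amount: int, nonce: bytes) -> str:
    return hashlib.sha256(nonce + bid_amount.to_bytes(8, "big")).hexdigest()

# Bidding phase: only the commitment travels to the auctioneer.
nonce = os.urandom(16)
sealed = commit(250, nonce)

# Opening phase: the bidder reveals (250, nonce); anyone can re-check the seal.
assert commit(250, nonce) == sealed
assert commit(300, nonce) != sealed      # a modified bid no longer matches the seal
```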

4) Robustness

Ensures that the system can recover from the faulty behavior of any (reasonably sized) coalition of parties.

5) Fairness

- Every authorized bidder should get the opportunity to out-bid the current highest bid.

- Non-discrimination of bids: the auctioneer cannot discriminate between bids based on the bidder or the bid amount.

- Ordering of bids (Dutch auction): only bids made within a reasonably small interval of the present time may be reordered by the auction implementation.

- Timely bids: Bids can only be committed (i.e. taken into account during the determination of the winners and clearing price) if they are submitted before the prescribed auction closing.

6) Non-repudiation

Once a buyer has submitted a bid, he or she can no longer deny having done so.

7) Verifiability

Ensures that bidders and sellers can verify the correct operation of the auction, i.e. that the winner and the clearing price are computed fairly from the bids that were correctly submitted.

8) Correctness

A system is called correct if

(1) it supports bidder integrity,

(2) it supports bid integrity,

(3) it is fair,

(4) auction rules (start and end time, etc.) are not compromised.

9) Other

- For open-cry auctions, spurious bids, injected by the seller to prompt the highest bidder to further increase his bid, should be prevented.

9.3.2. Anonymity related requirements

Depending on user preferences, different anonymity properties have to be provided by the auction system implementation.

• Bidder anonymity: the bidder must remain anonymous towards the other parties.

• Bidder/bid untraceability: it should be impossible for an unauthorized party to trace the bidder from a particular bid or vice versa.

Note 1: Anonymity is phase-dependent. Bidders can choose to stay anonymous during the bidding phase of an auction. Once the auction is closed, the following can happen:

- All bidders are revealed (but the bids stay sealed except for the winning bids).

- Only the winners are revealed to all parties that participated in the auction process.

- Only the winners are revealed, and only to the seller and/or the auctioneer.

- The winners stay anonymous (in which case means of anonymous payment and shipping have to be provided).

Note 2: Unlinkability is intrinsically provided by the system, given that the complete process is restarted from the beginning (registration phase) every time a new auction starts.

• Seller anonymity (optional): The seller of an item can choose to stay anonymous towards the buyers and/or the auctioneer. If the seller wishes to stay anonymous, means of anonymous payment and shipping have to be provided.

A means to prove certain credentials without revealing one's true identity may also have to be provided. This way buyers can assure themselves that they are not buying stolen goods, etc.

Some anonymity control must be provided in order to prevent abuse.

9.4. Short overview of existing solutions

Different solutions have been proposed in the literature. In this chapter we give a short review of selected proposals.

Most proposals focus on sealed-bid auctions, and different approaches are used to achieve this goal. Some proposals, like [79][114], use secret-sharing primitives to distribute the value of a bid among many trustees. When at least a minimum threshold of the trustees is honest, they will not assist in opening the bid before the closing period. This approach generally results in inefficient systems when public verifiability is required, because there exists no efficient protocol for publicly verifiable encryption [185], which is an essential building block for publicly verifiable secret-sharing schemes. The other approach to publicly verifiable secret sharing is that of Schoenmakers [179], which is more efficient than the scheme by Stadler [185]; however, its application to the auction scheme will probably remain inefficient.
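To make the secret-sharing idea concrete, the toy sketch below splits a bid with Shamir's scheme so that any t of m trustees can reconstruct it after the closing time, while fewer than t shares reveal nothing. It is a plain (non-verifiable) illustration, not the publicly verifiable schemes of [179] or [185].

```python
# Toy Shamir secret sharing of a bid value over a small prime field: honest
# trustees simply refuse to combine their shares before the closing time.
import random
P = 2_147_483_647                      # a prime larger than any bid value (demo only)

def share(secret, t, m):
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, m + 1)]

def reconstruct(shares):               # Lagrange interpolation at x = 0
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = share(secret=750, t=3, m=5)   # bid of 750, threshold 3 of 5 trustees
print(reconstruct(shares[:3]))         # 750: any three shares suffice
```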

Harkavy, Tygar and Kikuchi [93] proposed an auction scheme based on secure distributed computing primitives. This proposal is also only moderately efficient. They give an example for 1000 bidders, 1024 possible prices and 4 auctioneers (of which 2 have to cooperate in order to violate privacy). In this case a total communication for each auctioneer of 1.2 MBytes in 7000 messages is necessary for bid submission. Bid resolution requires another 26.2 MBytes in 288 messages over 48 communication rounds for each auctioneer.


Viswanathan, Boyd and Dawson [196] claim to have a sealed-bid auction scheme which is computationally more efficient than the schemes we discussed above. Their system consists of two subsystems: an anonymity subsystem that provides anonymity to all its users and an auction subsystem that allows the users to participate in the auction procedure.

Other auction schemes can be found in [79] and [135].


10. Legal issues

10.1. Introduction

In order to guarantee an appropriate implementation of the various technical applications described in the subsections above, legal certainty should be achieved too. What follows are examples of possible legal issues that arise when we focus on the legal framework of on-line anonymity services.

The applications described earlier in this report are part of the so-called category of on-line services or "information society services" [65], that is to say, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services [66]. Consequently, these services cover a wide range of economic activities performed by electronic means.

Providers of on-line services are subject to a number of specific legal requirements adapted to the specificity of the on-line environment, besides the general rules of contract law, commercial law and public law. The idea behind those specific legal requirements is, on the one hand, to legally protect recipients of on-line services and to enhance their trust in concluding contracts via e-commerce [68]. On the other hand, there is a need to maintain a balance between sometimes conflicting interests (e.g. regarding the liability regime) in order to support the development of e-commerce in general.

As the last chapter of the present technical report, we limit our overview at this stage to a non-exhaustive number of possible legal issues, in connection with specific legal requirements related to the peculiarities of the on-line environment only.

10.2. Possible legal issues

10.2.1. General information requirement

There is a general requirement for the natural or legal person providing an information society service, hereafter called the on-line service provider, to ensure that information relating to his identity and location is easily, directly and permanently accessible to the recipients of the service.

Examples of this information requirement are: the name of the on-line service provider, the geographic address at which the on-line service provider is established, and the details of the on-line service provider, including his electronic mail address, trade register number, VAT number, etc.

The purpose of identifying and locating the on-line service provider is to ensure effective implementation of contractual and legal requirements, so that effective redress is possible in case of damage, in the same way as in the off-line environment.


10.2.2. Specific transparency requirements for commercial communications

If the on-line service provider launches commercial advertising to promote his services, this has to be done in a recognizable and transparent way, in order to avoid any misleading information.

Consequently, the on-line service provider has to ensure absolute transparency about his identity as well as about the content of the commercial communication and promotional offers. The commercial communication should be easily recognizable, e.g. by means of banners.

10.2.3. Pre-contractual information requirement

Prior to any order placed by the recipient of an on-line service, the provider of the service has to inform the recipient about specific modalities regarding the conclusion of the contract, such as the different technical steps to follow to conclude the contract and the appropriate and accessible technical means for identifying and correcting input errors prior to the placing of an order for a service. This information should be given in a clear, comprehensive and unambiguous way.

Furthermore, contract terms and general conditions provided to the recipient must be made available in a way that allows him to store and reproduce them.

In order to clearly establish the conclusion of the contract, the service provider has to acknowledge receipt of the recipient's order without undue delay and by electronic means.

10.2.4. Liability of on-line service providers

• liability in connection with illegal content of information

This liability regime is especially relevant for on-line anonymity services and concerns the relation between the on-line service provider and third parties incurring damage due to illegal content. It does not affect the contractual relationship between the provider and the recipient of the on-line service, nor the relationship between the on-line service provider and public authorities.

On-line service providers have a duty to act, under certain circumstances, with a view to preventing or stopping illegal activities. Consequently, exemptions from this kind of liability cover only cases where the activity of the provider of on-line services is limited to the technical process of operating and giving access to a communication network over which information, made available by third parties, is transmitted or temporarily stored for the sole purpose of making the transmission more efficient.

This activity is of a mere technical, automatic and passive nature, which implies that the on-line service provider has neither knowledge of nor control over the information that is transmitted or stored.

On-line service providers can benefit from the exemptions for "mere conduit" and for "caching" when they are in no way involved with the information transmitted. This requires that they do not initiate the transmission, nor select the receiver of the transmission. Manipulations of a technical nature that take place in the course of the transmission and do not alter the integrity of the information contained in the transmission are not regarded as involvement in the transmitted information and are therefore covered by the exemption from liability.

An on-line service provider who deliberately collaborates with one of the recipients of his service in order to undertake illegal acts goes beyond the activities of "mere conduit" or "caching" and as a result cannot benefit from the liability exemptions established for these activities.

Besides the exemptions from liability in case of "mere conduit" or "caching", the provider of an on-line service consisting of the storage of information can also benefit from a limitation of liability, provided that, as soon as he obtains actual knowledge or awareness of illegal activities, he acts expeditiously to remove or to disable access to the illegal information.

• Liability of certification service providers

Certification service providers, issuing certificates or providing other services related to electronic signatures, cannot be prevented from indicating in the certificate a pseudonym instead of the signatory's name [67]. In this context, the liability regime of certification service providers is relevant to the present research on anonymity and privacy in electronic services.

A specific liability regime is applicable to certification service providers issuing qualified certificates, with a view to creating, as stated earlier, a balance of interests between certification service providers and recipients of certificates.

This specific liability regime concerns the relation between certification service providers issuing qualified certificates and any entity or legal or natural person who reasonably relies on such a certificate. It does not affect the contractual relationship between certification service providers and the recipient of a certificate, nor the relationship between the certification providers and public authorities.

Three aspects regarding the specific liability regime should be considered:

1. The certification service provider is liable for the damage resulting from the inaccuracy and incompleteness of information contained in the qualified certificate at the time of the issuance of the certificate. Indeed, one cannot reasonably expect that the certification service provider would permanently verify the accuracy of the information. This is a responsibility of the recipient of the certificate, who may have to have the certificate revoked.

2. The certification service provider should also guarantee that the recipient of the certificate holds, at the time of the issuance of the certificate, the signature-creation data corresponding to the signature-verification data given in the certificate. If the certification service provider generates both, he should ensure that they can be used in a complementary manner.

3. Finally, the certification service provider should ensure that the date and time of revocation of the certificate are accurately registered.

The certification service provider is liable for damage caused by non-compliance with the above-mentioned obligations, unless the certification service provider proves that he has not acted negligently. It is, for instance, conceivable that the certification service provider registered the revocation of a certificate in a register accessible on his website, but that third parties had no access to the website for a reason beyond the control of the certification service provider.

The certification service provider can limit his liability on two grounds only: by indicating in a qualified certificate limitations on the use of that certificate, or limitations on the value of transactions for which the certificate can be used, provided that these limitations are recognisable to third parties.

The examples of possible legal issues briefly described above will be incorporated in a general overview of legal issues and further examined in detail in the report on general legal aspects of anonymity.

11. Future work

In this document only a selection of all the possible applications is discussed. In the future we might look at the other applications in more detail.

In further steps we will analyze in depth the technical solutions.

Another important task for the future is the development of a mathematical model that allows us to measure the different degrees of anonymity (see chapter 2.3.2).

The impact of providing anonymity on other requirements (e.g. performance) should be considered and measured, because some solutions might make the system impractical.


References

[1] M. Abe, "Universally Verifiable Mix-Net with Verification Work Independent of the Number of Mix-Servers", Advances in Cryptology - Eurocrypt 1998, Springer-Verlag LNCS 1403, pp. 437 ff, 1998.
[2] R. J. Anderson, "The Eternity Service", Pragocrypt 1996, 1996.
[3] N. Asokan, "Anonymity in a mobile computing environment", 1994.
[4] A. Bacard, "The Computer Privacy Handbook", Peachpit Press, Berkeley, California, 1996.
[5] A. Bacard, "Anonymous Remailer FAQ", http://wwww.well.com/user/abacard/remail.html.
[6] D. Balenson, "Privacy enhancement for internet electronic mail: Part III: Algorithms, modes, and identifiers", Network Working Group Request for Comments RFC 1423, IETF, Feb. 1993.
[7] D. Beaver, "Perfect Privacy for Two-Party Protocols", TR-11-89, Harvard University, 1989.
[8] V. Bellotti and A. Sellen, "Design for Privacy in Ubiquitous Computing Environments", Proc. 3rd European Conf. on Computer Supported Cooperative Work (ECSCW 93), G. de Michelis, C. Simone and K. Schmidt (Eds.), Kluwer, 1993, pp. 77-92.
[9] Benaloh and M. Yung, "Distributing the power of a government to enhance the privacy of voters", In Proc. 5th ACM Symposium on Principles of Distributed Computing (PODC '86), pages 52-62, New York, 1986. ACM.
[10] J. Benaloh, "Verifiable Secret-Ballot Elections", PhD thesis, Yale University, Department of Computer Science, New Haven, CT, September 1987.
[11] J. Benaloh and D. Tuinstra, "Receipt-free secret-ballot elections", In Proc. 26th Symposium on Theory of Computing (STOC '94), pages 544-553, New York, 1994. ACM.
[12] C. H. Bennett, G. Brassard, C. Crépeau and U. M. Maurer, "Privacy amplification against probabilistic information", in preparation.
[13] C. H. Bennett, G. Brassard, C. Crépeau and U. M. Maurer, "Generalized privacy amplification", IEEE Transactions on Information Theory, vol. 41, Nov. 1995 (to appear).
[14] O. Berthold, H. Federrath and M. Kohntopp, "Anonymity and unobservability on the Internet", In Workshop on Freedom and Privacy by Design: CFP 2000, 2000.
[15] M. Blaze, J. Feigenbaum and J. Lacy, "Decentralized trust management", IEEE Symposium on Security and Privacy, Oakland, CA, May 1996.
[16] D. Bleichenbacher, E. Gabber, P. Gibbons, Y. Matias and A. Mayer, "On personalized yet anonymous interaction", Technical report, Bell Laboratories, April 1997.
[17] C. Blundo, L. A. Frota Mattos and D. R. Stinson, "Multiple Key Distribution Maintaining User Anonymity via Broadcast Channels", J. Computer Security 3 (1994/95), 309-323.
[18] J. Boly, A. Bosselaers, R. Cramer, R. Michelsen, S. Mjolsnes, F. Muller, T. Pedersen, B. Pfitzmann, P. de Rooij, B. Schoenmakers, M. Schunten, L. Vallee, M. Waidner, "The ESPRIT Project CAFE - High Security Digital Payment Systems".
[19] J. Bos, D. Chaum and G. Purdy, "A Voting Scheme", unpublished manuscript, presented at the rump session of Crypto '88.
[20] S. A. Brands, "Electronic cash on the internet".
[21] S. A. Brands, "Untraceable Off-line Electronic Cash Based on Secret-key Certificates", Latin 95.
[22] S. A. Brands, "Untraceable Off-line Cash in Wallets with Observers", Proceedings of Crypto '93, LNCS 773, Springer Verlag, pp. 302-318.
[23] E. Brickell, P. Gemmel and D. Kravitz, "Trustee-based Tracing Extensions to Anonymous Cash and the Making of Anonymous Change", Proceedings of the 6th Annual Symposium on Discrete Algorithms (SODA), 1995.
[24] J. Camenish, J.-M. Piveteau and M. Stadler, "An Efficient Payment System Protecting Privacy", Proceedings of ESORICS '94, LNCS 875, Springer Verlag, pp. 207-215.
[25] J. Camenish, J.-M. Piveteau and M. Stadler, "An Efficient Fair Payment System", Third ACM Conference on Computer and Communications Security, 1996.
[26] J. Camenish, U. Maurer and M. Stadler, "Digital Payment Systems with Passive Anonymity-Revoking Trustees", Computer Security - ESORICS 96.
[27] L. Jean Camp, M. Harkavi, B. Yee, J. D. Tygar, "Anonymous Atomic Transactions", 2nd Annual USENIX Workshop on Electronic Commerce, Nov. 1996, Oakland (CA), pp. 123-134.
[28] L. J. Camp, "Web security & privacy: An American perspective", ACM SIGCAS CEPC '97 (Computer Ethics: Philosophical Inquiry). Previous version presented as "Privacy on the Web", The Internet Society 1997 Symposium on Network & Distributed System Security, 10-11 February 1997, San Diego, CA.

[29] R. Canetti and R. Gennaro, "Incoercible multiparty computation", In 37th IEEE Symposium on Foundations of Computer Science (FOCS '96), 1996.
[30] T. Casey and S. Wilbur, "Privacy Enhanced Electronic Mail", Proceedings of the Fourth Aerospace Computer Security Applications Conference, pp. 16-21 (Dec. 1988).
[31] D. Chaum, "Blind Signatures for Untraceable Payments", Advances in Cryptology - Proceedings of Crypto '88.
[32] D. Chaum, "Security without Identification: Transaction Systems to make Big Brother Obsolete", Communications of the ACM 28/10 (1985).
[33] D. Chaum, "The Dining Cryptographers Problem, Unconditional Sender Anonymity", Draft, received May 13, 1985.
[34] D. Chaum, "Privacy Protected Payments - Unconditional Payer and/or Payee Untraceability", SMART CARD 2000: The Future of IC Cards, Proceedings of the IFIP WG 11.6 International Conference, Laxenburg (Austria), 19-20.10.1987, North-Holland, Amsterdam, 1989.
[35] D. Chaum, "Achieving Electronic Privacy", Scientific American Cash; Crypto '88.
[36] D. Chaum, A. Fiat and M. Naor, "Untraceable Electronic Cash", Crypto '88.
[37] D. Chaum and H. van Antwerpen, "Undeniable Signatures", Crypto '89.
[38] D. Chaum, "Untraceable electronic mail, return address, and digital pseudonyms", Communications of the ACM, ACM, 1981.
[39] D. Chaum, "Privacy Protected Payments. Unconditional Payer and/or Payee Untraceability", received October 31, 1986.
[40] D. Chaum, "Verification by anonymous monitors", In Allen Gersho, editor, Advances in Cryptology: A Report on CRYPTO 81, pages 138-139. U.C. Santa Barbara Dept. of Elec. and Computer Eng., 1982. Tech Report 82-04.
[41] D. Chaum, "Showing credentials without identification: Transferring signatures between unconditionally unlinkable pseudonyms", In Auscrypt '90, pages 246-264, Berlin, 1990. Springer-Verlag.
[42] D. Chaum, I. B. Damgaard and J. van de Graaf, "Multiparty computations ensuring privacy of each party's input and correctness of the result", Advances in Cryptology - CRYPTO '87 Proceedings, Springer-Verlag, 1988, pp. 87-119.
[43] L. Chen and T. P. Pedersen, "New group signature schemes", In Advances in Cryptology - EUROCRYPT '94, volume 950 of Lecture Notes in Computer Science, pages 171-181, Berlin, 1995. Springer-Verlag.
[44] P. Choonsik, I. Kazutomo and K. Kaoru, "All/Nothing Election Scheme and Anonymous Channel", EUROCRYPT '93, Pre-proceedings, Lofthus, May 1993, T97-T112.
[45] J. Claessens, B. Preneel and J. Vandewalle, "Anonymity controlled electronic payment systems", 20th Symposium on Information Theory in the Benelux, Haasrode, Belgium, 27-28 May 1999, pp. 109-116.
[46] J. Claessens, B. Preneel and J. Vandewalle, "Solutions for Anonymous Communication on the Internet", Proceedings of the IEEE 33rd Annual 1999 International Carnahan Conference on Security Technology (ICCST '99), 1999, pp. 298-303.
[47] R. Clarke, "Introduction to Dataveillance and Information Privacy, and Definitions of Terms", http://www.anu.edu.au/people/Roger.Clarke/DV/Intro.html
[48] R. Clarke, "Computer Matching and Digital Identity", 1993, http://www.anu.edu.au/people/Roger.Clarke/DV/CFP93.html.
[49] B. C. Neuman, "Security, Payment, and Privacy for Network Commerce", IEEE Journal, Vol. 13, No. 8, October 1995.
[50] C. Clifton and D. Marks, "Security and privacy implications of data mining", In Proc. 1996 SIGMOD '96 Workshop on Research Issues on Data Mining and Knowledge Discovery (DMKD '96), pages 15-20, Montreal, Canada, June 1996.
[51] J. Cohen and M. Fischer, "A robust and verifiable cryptographically secure election scheme", In Proc. 26th IEEE Symposium on Foundations of Computer Science (FOCS '85), pages 372-382. IEEE Computer Society, 1985.
[52] Community ConneXion, "Anonymous Surfing", 1996. http://www.anonymizer.com/
[53] L. Cottrell, "Mixmaster & Remailer Attacks".
[54] L. Cottrell, "Remailer Essay", http://www.obscura.com/~loki/remailer/remaileressay.html. Describes the current state of the art in Internet anonymity.
[55] P. Covell, "Digital Identity in Cyberspace", 1998, http://cyber.law.harvard.edu/courses/ltac98/white-paper.html.
[56] B. Cox, "Maintaining Privacy in Electronic Transactions", Technical Report TR 1994-9, Carnegie Mellon University Information Networking Institute, Pittsburg, PA, September 1994.
[57] P. Cox, "An investigation into the security issues surrounding data transmission and user anonymity using the global system for mobile communications".
[58] R. Cramer and T. Pedersen, "Improved privacy in wallets with observers".
[59] R. Cramer, R. Gennaro and B. Schoenmakers, "A Secure and Optimally Efficient Multi-Authority Election Scheme", Preliminary version appears in Proceedings of EUROCRYPT '97, 1997.

[60] R. Cramer, M. Franklin, B. Schoenmakers and M. Yung, "Multi-authority secret ballot elections with linear work", In Advances in Cryptology - EUROCRYPT '96, volume 1070 of Lecture Notes in Computer Science, pages 72-83, Berlin, 1996. Springer-Verlag.
[61] G. I. Davida, Y. Frankel, Y. Tsiounis and M. Yung, "Anonymity Control in E-Cash Systems", Financial Cryptography 97.
[62] Y. Desmedt and K. Kurosawa, "How to Break a Practical MIX and Design a New One", Advances in Cryptology - Eurocrypt 2000, Springer-Verlag LNCS 1807, pp. 557 ff, 2000.
[63] L. Detweiller, "Identity, Privacy and Anonymity on the Internet", 1993, http://www.eserver.org/Internet/Identity-Privacy-Anonymity.txt.
[64] B. De Win, "On the anonymity of electronic cash".
[65] Directive 2000/31/EC of the EU Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce), OJ, 17.07.2000, L178, 1.
[66] Directive 98/48/EC of the EU Parliament and of the Council of 20 July 1998 modifying the Directive 98/34/EC of 22 June 1998 laying down a procedure for the provision of information in the field of technical standards and regulations, OJ 1998, 217, 18.
[67] Directive 1999/93/EC of the EU Parliament and of the Council of 13 December 1999 on a Community framework for electronic signatures, OJ, 19.01.2000, L13/128.
[68] J. Dumortier, "Elektronische handel en consumentenbescherming in de Europese ontwerprichtlijn en het Belgisch recht", Computerrecht 1999/3, 124.
[69] A. Engelfriet, "Anonymity and privacy on the internet".
[70] A. Fasbender, D. Kesdogan and O. Kubitz, "Analysis of Security and Privacy in Mobile IP", 4th International Conference on Telecommunication Systems Modeling and Analysis, Nashville, March 1996.
[71] H. Feistel, "Cryptography and computer privacy", Scientific American, vol. 228, pp. 15-23, 1973.
[72] N. Ferguson, "Single Term Off-line Coins", in Advances in Cryptology - EUROCRYPT '93, volume 765 of Lecture Notes in Computer Science, pages 318-328, Berlin, 1994. Springer-Verlag.
[73] P. Ferragina, A. Monti and A. Roncato, "Trade-off between Computation Power and Common Knowledge in Anonymous Rings", PreProceedings of Colloquium on Structural Information and Communication Complexity, May 16-18 (1994), Ottawa, Canada.
[74] P. Fouque, G. Poupard and J. Stern, "Sharing Decryption in the Context of Voting or Lotteries", Proceedings of Financial Crypto 2000.
[75] Y. Frankel, Y. Tsiounis and M. Yung, "Indirect Discourse Proofs: Achieving Efficient Fair Off-Line E-Cash", Advances in Cryptology - Proceedings of Asiacrypt 96.
[76] M. Franklin and M. Yung, "Towards Provably Secure Efficient Electronic Cash", Columbia Univ. Dept. of C.S. TR CUCS-018-92, April 24, 1992.
[77] M. Franklin and M. Yung, "Blind Weak Signatures and its Applications: Putting Non-Cryptographic Secure Computation to Work", Advances in Cryptology - Proceedings of Eurocrypt '94.
[78] M. Franklin, Z. Galil, and M. Yung, "Eavesdropping games: A graph-theoretic approach to privacy in distributed systems", In Proceedings of the 34th Annual IEEE Symposium on Foundations of Computer Science, pages 670-679, 1993.

[79] M. Franklin and M. Reiter, “The design and implementation of a secure auction service”, IEEETransactions on Software Engineering 22(5), May 1996, pp. 302-312

[80] A. M. Froomkin, “The Essential Role of Trusted Third Parties in Electronic Commerce”,October 14 1996.

[81] A. M. Froomkin, “Anonymity and its Enmities”, Journal of Online Law, art. 4, 1995.

[82] A. M. Froomkin, “Flood control on the information ocean: Living with anonymity, digital cash, and distributed databases”, U. Pittsburgh Journal of Law and Commerce, 395(15), 1996.

[83] A. Fujioka, T. Okamoto and K. Ohta, “A practical secret voting scheme for large scale elections”, LNCS 718, Proc. Auscrypt '92, Springer, pp. 244-260.

[84] E. Gabber, P. Gibbons, Y. Matias, and A. Mayer, “How to Make Personalized Web Browsing Simple, Secure, and Anonymous”, Financial Cryptography '97, February 1997.

[85] M. Genesereth, N. Singh and M. Syed, “A Distributed Anonymous Knowledge Sharing Approach to Software Interoperation”, in Proceedings of the International Symposium on Fifth Generation Computing Systems, 1994, pp. 125-139.

[86] L. Gia, “Addressing anonymous messages in cyberspace”, Journal of Computer Mediated Communication, 2(1), 1996.

[87] B. Goddyn, “Defining anonymity and its dimensions in the electronic world”, March 2001, http://anonymityontheinternet.cjb.net.

[88] I. Goldberg, D. Wagner and E. Brewer, “Privacy-enhancing Technologies for the Internet”, Proc. of IEEE Spring COMPCON, 1997.

[89] I. Goldberg and D. Wagner, “TAZ Servers and the Rewebber Network: Enabling Anonymous Publishing on the World Wide Web”, University of California, Berkeley, 1997.

[90] D. Goldschlag, M. Reed and P. Syverson, “Privacy on the Internet”, INET '97, Kuala Lumpur, June 1997.

[91] D. Goldschlag, M. Reed and P. Syverson, “Onion Routing for Anonymous and Private Internet Connections”, Communications of the ACM, vol. 42, num. 2, February 1999.

[92] J. Gray and A. Reuter, “Transaction Processing: Concepts and Techniques”, Morgan Kaufmann Publishers, San Francisco (CA), 1993.

[93] M. Harkavy, J. Tygar and H. Kikuchi, “Electronic Auctions with Private Bids”, 3rd USENIX Workshop on Electronic Commerce, Boston, Mass., September 1998, pp. 61-73.

[94] R. C. Hauser, “Using the Internet to decrease Software Piracy - on Anonymous Receipts, Anonymous ID Cards, and Anonymous Vouchers”, In INET'95, The 5th Annual Conference of the Internet Society, The Internet: Towards Global Information Infrastructure, volume 1, pages 199-204, Honolulu, Hawaii, USA, June 1995.

[95] B. Hayes, “Anonymous one-time signatures and flexible untraceable electronic cash”, In Advances in Cryptology - Proceedings of AUSCRYPT'90 (LNCS 453), pages 294-305. Springer-Verlag, 1990.

[96] D. P. Helmbold, C. E. McDowell, and J.-Z. Wang, “Analyzing Traces with Anonymous Synchronization”, Proceedings of the 1990 International Conference on Parallel Processing, pp. 70-77, St. Charles, IL, August 1990.

[97] M. Hennessy and J. Riely, “Type-safe execution of mobile agents in anonymous networks”, Computer Science Technical Report 3/98, University of Sussex, 1998.

[98] M. Herschberg, “Secure electronic voting over the World Wide Web”, Master's thesis, Massachusetts Institute of Technology - Laboratory of Computer Science, Cambridge, MA, May 1997.

[99] P. Horster, M. Michels and H. Petersen, “Blind Multisignatures and their relevance for electronic voting”, Proc. 11th Annual Computer Security Applications Conference, IEEE Press, 1995, pp. 149-156.

[100] I. Jackson, “Anonymous Addresses and Confidentiality of Location”, in Information Hiding, Ross Anderson ed., Springer-Verlag, LNCS vol. 1174, June 1996, pages 115-120.

[101] I. W. Jackson, “Who goes here? Confidentiality of location through anonymity”, PhD thesis, University of Cambridge, February 1998. http://www.chiark.greenend.org.uk/~ijackson/thesis/.

[102] M. Jakobsson and D. M'Raïhi, “Mix-based Electronic Payments”, Fifth Annual Workshop on Selected Areas in Cryptography (SAC '98), Queen's University, Kingston, Ontario, Canada, August 17-18, 1998.

[103] M. Jakobsson, “A Practical Mix”, Theory and Application of Cryptographic Techniques, Springer-Verlag, LNCS 1403, pp. 448-461, 1998.

[104] M. Jakobsson, “Flash Mixing”, Symposium on Principles of Distributed Computing, pp. 83-89, 1999.

[105] M. Jakobsson and M. Yung, “Revokable and Versatile Electronic Money”, Third ACM Conference on Computer and Communications Security, 1996.

[106] M. Jakobsson, M. Yung, “Applying Anti-Trust Policies to Increase Trust in a Versatile E-Money System”, Financial Cryptography '97.

[107] M. Jakobsson, “Privacy vs. Anonymity”, Ph.D. Thesis, University of California, San Diego, 1997. Available at www.bell-labs.com/markusj/.

[108] M. Jakobsson, “Ripping Coins for a Fair Exchange”, In Louis C. Guillou and Jean-Jacques Quisquater, editors, Advances in Cryptology: Eurocrypt 95 Proceedings, volume 921 of Lecture Notes in Computer Science. Springer-Verlag, 1995.

[109] A. Jerichow, J. Muller, A. Pfitzmann, B. Pfitzmann and M. Waidner, “Real-Time Mixes: A Bandwidth-Efficient Anonymity Protocol”, IEEE Journal on Selected Areas in Communications 16/4 (1998), 495-509.

[110] B. Kaliski, “Privacy Enhancement for Internet Electronic Mail: Part IV: Key Certification and Related Services”, RFC 1424, February 1993.

[111] S. Kent, “Internet Privacy-Enhanced Mail”, Communications of the ACM, 36(8):48-60, 1993.

[112] S. Kent and J. Linn, “Privacy Enhancement for Internet Electronic Mail: Part II - Certificate-Based Key Management”, RFC 1114, August 1989.

[113] D. Kesdogan and R. Büschkes, “Stop-and-Go Mixes: Providing probabilistic anonymity in an open system”, In 1998 Information Hiding Workshop.

[114] H. Kikuchi, M. Harkavy and J. Tygar, “Multi-round Anonymous Auction Protocols”, TIEICE: IEICE Transactions on Communications/Electronics/Information and Systems, 1999.

[115] S. Kolletzki, “Secure internet banking with privacy enhanced mail - a protocol for reliable exchange of secured order forms (BAKO)”, In Computer Networks and ISDN Systems 28, pages 1891-1899. JENC7 - Budapest, Elsevier, 1996.

[116] M. Korkea-Aho, “Anonymity and Privacy in the Electronic World”, 1999, http://www.hut.fi/~mkorkeaa/doc/anonpriv.html.

[117] D. M. Kristol, S. H. Low and N. F. Maxemchuk, “Anonymous internet mercantile protocol”, 1994. Submitted.

[118] M. Kumar and S. Feldman, “Internet Auctions”, 3rd USENIX Workshop on Electronic Commerce, Boston, Mass., September 1998, pp. 49-60.

[119] E. Kushilevitz, “Privacy and Communication Complexity”, FOCS 89, and SIAM Jour. on Disc. Math., Vol. 5, No. 2, pp. 273-284, 1992.

[120] L. Law, S. Sabett and J. Solinas, “How to make a mint: the cryptography of anonymous electronic cash”, National Security Agency, Office of Information Security Research and Technology, Cryptology Division, June 1996.

[121] A. Lee, “Anonymous collaboration: An alternative technique for working together”, SIGCHI Bulletin, 26(3), July 1994.

[122] S. H. Low, N. F. Maxemchuk and S. Paul, “Anonymous Credit Cards”, Technical Report, AT&T Bell Laboratories, 1993. Submitted to IEEE Symposium on Security and Privacy, 1993.

[123] S. H. Low, N. F. Maxemchuk and S. Paul, “Anonymous credit cards and its collusion analysis”, Technical report, AT&T Bell Laboratories, Murray Hill, NJ 07974, October 1994.

[124] S. H. Low, N. F. Maxemchuk and S. Paul, “Analysis of Anonymous Credit Cards”.

[125] S. H. Low, N. F. Maxemchuk and S. Paul, “Collusion in a multiparty communication protocol for anonymous credit cards”, Submitted to IEEE/ACM Transactions on Networking.

[126] MasterCard Int., Visa Int., SET, Secure Electronic Transaction. http://www.setco.org.

[127] D. Martin Jr., “Local Anonymity in the Internet”, Ph.D. Dissertation, Boston University, 1999.

[128] U. M. Maurer and S. Wolf, “Privacy amplification secure against active adversaries”, 1997.

[129] G. Medvinsky, B. C. Neuman, “NetCash: A Design for Practical Electronic Currency on the Internet”, In ACM-CCS'93, 1993.

[130] M. Michels and P. Horster, “Some remarks on a receipt-free and universally verifiable mix-type voting scheme”, In Advances in Cryptology - ASIACRYPT '96, volume 1163 of Lecture Notes in Computer Science, pages 125-132, Berlin, 1996. Springer-Verlag.

[131] A. R. Miller, “Personal Privacy in the Computer Age: The Challenge of New Technology in an Information-Oriented Society”, Michigan Law Review 67: 1224-25.

[132] D. M'Raïhi, “Cost-Effective Payment Schemes with Privacy Regulation”, Advances in Cryptology - Proceedings of Asiacrypt '96.

[133] D. M'Raïhi and D. Pointcheval, “Distributed Trustees and Revocability: a Framework for Internet Payment”, In Financial Cryptography '98, Springer-Verlag, LNCS 1465, 1998.

[134] T. Nakanishi, N. Haruna and Y. Sugiyama, “Unlinkable Electronic Coupon Protocol with Anonymity Control”, Proc. of Second International Information Security Workshop, ISW'99, LNCS 1729, Springer-Verlag, pp. 37-46 (1999).

[135] M. Naor, B. Pinkas and R. Sumner, “Privacy preserving auctions and mechanism design”.

[136] NetCash, “NetCash(SM): Anonymous Network Payment”, http://nii-server.isi.edu:80/info/netcash/, 1995.

[137] P. G. Neumann, “Security criteria for electronic voting”, In Proceedings of the 16th National Computer Security Conference (1993), pp. 478-481. Baltimore, Maryland, September 20-23.

[138] V. Niemi and A. Renvall, “How to prevent buying of votes in computer elections”, In Advances in Cryptology - ASIACRYPT '94, volume 739 of Lecture Notes in Computer Science, pages 141-148, Berlin, 1994. Springer-Verlag.

[139] H. Nissenbaum, “The meaning of anonymity in an information age”, The Information Society, 15(2), 1999.

[140] W. Ogata, K. Kurosawa, K. Sako and K. Takatani, “Fault Tolerant Anonymous Channel”, Information and Communications Security '97.

[141] K. Oishi, M. Mambo, and E. Okamoto, “Anonymous public key certificates and the applications”, IEICE Transactions on Fundamentals of Electronics, Communications, and Computer Sciences, E81-A(1):56-64, Jan. 1998.

[142] T. Okamoto, K. Ohta, “Universal electronic Cash”, Crypto '91 (Lecture Notes in Computer Science), pages 338-350. Springer-Verlag, 1992.

[143] T. Okamoto, “Receipt-free electronic voting schemes for large scale elections”, In the Proceedings of Security Protocols Workshop '97, pages 25-35, Paris, 1997.

[144] T. Okamoto, “An Electronic Voting Scheme”, Proc. of IFIP'96, Advanced IT Tools, Chapman & Hall, pp. 21-30 (1996).

[145] T. Okamoto and K. Ohta, “One-Time Zero-Knowledge Authentications and Their Applications to Untraceable Electronic Cash”, IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, Vol. E81-A, No. 1, pp. 2-10 (1998).

[146] C. Park, K. Itoh, and K. Kurosawa, “Efficient anonymous channel and all/nothing election scheme”, In Advances in Cryptology - EUROCRYPT '93, volume 765 of Lecture Notes in Computer Science, pages 248-259, Berlin, 1994. Springer-Verlag.

[147] T. Pedersen, “A threshold cryptosystem without a trusted party”, In Advances in Cryptology - EUROCRYPT '91, volume 547 of Lecture Notes in Computer Science, pages 522-526, Berlin, 1991. Springer-Verlag.

[148] B. Pfitzmann and M. Waidner, “How to Break and Repair a 'Provably Secure' Untraceable Payment System”, Proc. CRYPTO '91, pp. 394-337.

[149] B. Pfitzmann, M. Waidner and A. Pfitzmann, “Secure and Anonymous Electronic Commerce: Providing Legal Certainty in Open Digital Systems Without Compromising Anonymity”, IBM Research Report RZ 3232 (#93278) 05/22/00, IBM Research Division, Zurich, May 2000.

[150] B. Pfitzmann and M. Waidner, “Unconditionally Untraceable and Fault-tolerant Broadcast and Secret Ballot Election”, Hildesheimer Informatik-Berichte 3/92, ISSN 0941-3014, University of Hildesheim, Institut für Informatik, 1992.

[151] B. Pfitzmann and M. Waidner, “Anonymous fingerprinting”, In EuroCrypt '97, LNCS, Berlin, 1997. Springer-Verlag.

[152] B. Pfitzmann, “Breaking an efficient anonymous channel”, In Advances in Cryptology - EUROCRYPT '94, volume 950 of Lecture Notes in Computer Science, pages 332-340, Berlin, 1995. Springer-Verlag.

[153] B. Pfitzmann and M. Waidner, “Strong Loss Tolerance for Untraceable Electronic Coin Systems”, Hildesheimer Informatik-Berichte 15/95, (1995), 44 pages.

[154] A. Pfitzmann, B. Pfitzmann and M. Waidner, “ISDN-Mixes: Untraceable Communication with Very Small Bandwidth Overhead”, GI/ITG Conference: Communication in Distributed Systems, Mannheim, Feb. 20-22, 1991, Informatik-Fachberichte 267, Springer-Verlag, Heidelberg 1991, pp. 451-463.

[155] C. Pfleeger and D. Cooper, “Security and Privacy: Promising Advances”, IEEE Software, pages 27-32, September/October 1997.

[156] Proton World Int. http://www.protonworld.com.

[157] C. Rackoff and D. Simon, “Cryptographic Defense Against Traffic Analysis”, Proc. 25th ACM Symp. on the Theory of Computation (1993), pp. 672-681.

[158] C. Radu, “Analysis and Design of Off-Line Electronic Payment Systems”, PhD. Thesis, COSIC, K.U. Leuven, 1997.

[159] C. Radu, R. Govaerts and J. Vandewalle, “Efficient Electronic Cash with Restricted Privacy”, Proc. Financial Cryptography Workshop, (1997), 8 pages.

[160] M. G. Reed, P. F. Syverson, and D. M. Goldschlag, “Proxies for Anonymous Routing”, Proc. 12th Annual Computer Security Applications Conference, San Diego, CA, IEEE CS Press, December 1996, pp. 95-104.

[161] M. Reed, P. Syverson, and D. Goldschlag, “Protocols using Anonymous Connections: Mobile Applications”, Security Protocols: Fifth International Workshop, LNCS vol. 1361, Springer-Verlag, pp. 13-23, April 1997.

[162] M. K. Reiter and A. D. Rubin, “Crowds: Anonymity for Web Transactions”, DIMACS Technical Report 97-15, 1997.

[163] A. Riera, J. Borrell and J. Rifà, “An Uncoercible Verifiable Electronic Voting Protocol”, IFIP SEC '98.

[164] A. Riera, “An introduction to electronic voting systems”.

[165] R. Rivest and A. Shamir, “PayWord and MicroMint: Two Simple Micropayment Schemes”, 1996.

[166] H. van Rossum, H. Gardeniers, J. Borking et al., “Privacy-Enhancing Technologies: The Path to Anonymity”, Volumes I and II; Achtergrondstudies en Verkenningen 5a/5b; Registratiekamer, The Netherlands & Information and Privacy Commissioner/Ontario, Canada; August 1995; http://www.ipc.on.ca/english/pubpres/sum_pap/papers/anon-e.htm.

[167] J. Rothfeder, “Privacy for sale: How computerization has made everyone's private life an open secret”, Simon and Schuster: New York, 1992.

[168] SafeVote website: http://safevote.com.

[169] SafeVote (Ed Gerk, editor), “Voting system requirements”, The Bell, Vol. 1, No. 7, p. 3, November 2000.

[170] K. Sako and J. Kilian, “Receipt-Free Mix-Type Voting Scheme”, Eurocrypt 1995, pp. 393-403.

[171] K. Sako, “Electronic voting scheme allowing open objection to the tally”, IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, E77-A(1):24-30, January 1994.

[172] P. Samarati and L. Sweeney, “Generalizing data to provide anonymity when disclosing information”, In ACM Principles of Database Systems, Seattle, Washington, USA, June 1998.

[173] D. Samfat and R. Molva, “A method providing identity privacy to mobile users during authentication”, Proceedings of the Workshop on Mobile Computing Systems and Applications, December 1994.

[174] D. Samfat, R. Molva and N. Asokan, “Untraceability in mobile networks”, in Proceedings of MobiCom '95.

[175] T. Sander and A. Ta-Shma, “Auditable, anonymous electronic cash”, In Crypto 99.

[176] T. Sander and A. Ta-Shma, “A new approach for anonymity control in electronic cash systems”, In Financial Cryptography, 1999.

[177] D. Sayer, “The Erosion of Privacy and Security in Public Telecommunication Networks: Growing Significance of Telemetadata in Advanced Communication Services”, European Commission's FAIR project, Working Paper No. 13, Sussex, 1997.

[178] B. Schneier and J. Riordan, “A Certified E-Mail Protocol”.

[179] B. Schoenmakers, “A simple publicly verifiable secret sharing scheme and its application to electronic voting”, Advances in Cryptology - CRYPTO '99, Springer-Verlag, 1999.

[180] B. Schoenmakers, “Basic Security of the ecash™ Payment System”, In Computer Security and Industrial Cryptography: State of the Art and Evolution, ESAT-COSIC course, Leuven, Belgium, 1997. Springer-Verlag, LNCS 1528, 1998.

[181] B. Schoenmakers, “A simple publicly verifiable secret sharing scheme and its application to electronic voting”, In Advances in Cryptology - CRYPTO '99, volume 1666 of Lecture Notes in Computer Science, pages 148-164, Berlin, 1999. Springer-Verlag.

[182] C. Shields and B. Levine, “A Protocol for Anonymous Communication Over the Internet”, ACM Conference on Computer and Communications Security, Athens, Greece, November 1-4, 2000.

[183] D. R. Simon, “Anonymous Communication and Anonymous Cash”, In Crypto '96, Springer-Verlag, LNCS 1109, 1996.

[184] S. von Solms and D. Naccache, “On Blind Signatures and Perfect Crimes”, Computers and Security, 11 (1992).

[185] M. Stadler, “Publicly verifiable secret sharing”, Advances in Cryptology - EUROCRYPT '96, Springer-Verlag, Ueli Maurer, ed., pp. 190-199, 1996.

[186] M. Stadler, J.-M. Piveteau, J. Camenisch, “Fair Blind Signatures”, Advances in Cryptology - EUROCRYPT '95, pages 209-219. Springer-Verlag, 1995.

[187] M. Stadler, “Cryptographic protocols for revocable privacy”, Dissertation, ETH Zurich, (1996), 111 pages.

[188] T. Stanley, “Electronic Communications Privacy Rights”, CPSR Civil Liberties Project. URL: http://www-leland.stanford.edu/~tstanley/cpsrart.html.

[189] P. A. Strassmann and W. Marlow, “Risk-free access into the global information infrastructure via anonymous re-mailers”, In Symposium on the Global Information Infrastructure, Cambridge, MA, January 28-30, 1996.

[190] S. Stubblebine and P. Syverson, “Fair on-line auctions without special trusted parties”, Proc. of Financial Cryptography '99, 1999.

[191] P. Syverson and S. Stubblebine, “Principals and the Formalization of Anonymity”, in FM'99 - Formal Methods, Vol. I, J.M. Wing, J. Woodcock, and J. Davies (eds.), Springer-Verlag, LNCS vol. 1708, pp. 814-833, 1999.

[192] P. Syverson, D. Goldschlag and M. Reed, “Anonymous Connections and Onion Routing”, IEEE Symposium on Security and Privacy, 1997.

[193] P. Syverson, S. Stubblebine and D. Goldschlag, “Unlinkable Serial Transactions”, Financial Cryptography 1997, LNCS Vol. 1318, Springer-Verlag, 1997.

[194] P. Syverson, D. Goldschlag and M. Reed, “Anonymous Connections and Onion Routing”, IEEE Journal on Selected Areas in Communications, vol. 16, no. 4, May 1998, pp. 482-494.

[195] J. D. Tygar, “Atomicity versus anonymity: Distributed transactions for electronic commerce”, In Proc. VLDB, pages 1-12, 1998.

[196] K. Viswanathan, C. Boyd and Ed Dawson, “A three phased schema for sealed bid auction system design”.

[197] M. Waldman, A. D. Rubin and L. F. Cranor, “Publius: A robust, tamper-evident, censorship-resistant web publishing system”.

[198] M. Waidner and B. Pfitzmann, “The dining cryptographers in the disco: Unconditional sender and recipient untraceability with computationally secure serviceability”, In Eurocrypt '89, volume 434 of Lecture Notes in Computer Science. Springer-Verlag, 1990.

[199] M. Waidner, “Unconditional sender and recipient untraceability in spite of active attacks”, In Eurocrypt '89, volume 434 of Lecture Notes in Computer Science, pages 302-319. Springer-Verlag, 1989.

[200] D. Wilson, “Cash Subdividable into Unlinkable Pieces”, Unpublished manuscript.

[201] M. Winslett, N. Ching, V. Jones and I. Slepchin, “Assuring security and privacy for digital library transactions on the web: client and server security policies”, Proceedings of ADL '97 - Forum on Research and Technology Advances in Digital Libraries, Washington, DC, May 1997.

[202] http://anon.efga.org.

[203] http://votehere.net, VoteHere website.

[204] http://www.research.att.com/projects/crowds/manual.html - Crowds user manual.

[205] http://www.aaas.org/spp/anon - AAAS, “Anonymous Communications on the Internet”.

[206] http://www.agorics.com/ - Agorics Inc.

[207] http://www.anonymizer.com/ - the Anonymizer website.

[208] http://www.bell-labs.com/project/lpwa/ - about the Lucent Personalized Web Assistant.

[209] http://www.cs.berkeley.edu/~daw/ - homepage of David Wagner.

[210] http://www.eskimo.com/~weidai/ - homepage of Wei Dai.

[211] http://www.inf.tu-dresden.de/~hf2/anon/index.html - Privacy and Anonymity on the Internet.

[212] http://www.kulak.ac.be/~vnaessen/ - presentation about anonymity (by Vincent Naessens).

[213] http://www.lpwa.com/ - the Lucent Personalized Web Assistant website.

[214] http://www.mailanon.com/ - the Anonymous E-mail Remailer website.

[215] http://www.research.att.com/projects/crowds/slides/index.htm - presentation about Crowds.

[216] http://www.research.att.com/projects/crowds/spec.txt - Crowds communication specification.

[217] http://www.research.att.com/

[218] http://www.safevote.com, Ed Gerk, “Voting System Requirements”.