Security matters: privacy in voting and fairness in digital exchange

H.L. Jonker

Research output: Thesis › PhD Thesis 1 (Research TU/e / Graduation TU/e)


Abstract

Security matters. In the real world, various basic mechanisms provide security. Their digital counterparts have been studied and are now well understood. Understanding of more complex security notions has yet to mature. This thesis studies two such notions: privacy in voting and fairness in digital exchange. As the two notions belong to different domains, the thesis discusses each in its own part.

Part I: Privacy in Voting

Voting systems have been evolving for years, incorporating new procedures and new countermeasures to detected problems. When voting emerged as a research area in computer science in the 1980s, it was not straightforward to transform these procedures and countermeasures into the clearly specified, exact requirements that computer science demands. Thus, requirements for voting systems had to be rediscovered in computer science over the years. Without a high-level overview of the field, it remains unclear whether the set of requirements covers all desired properties. To this end, we analysed the domain of voting (cf. Chapter 2). The analysis resulted in a set of high-level security requirements for voting systems. Amongst these is the requirement that all voters are free to cast their votes. Privacy is an essential ingredient in ensuring this: if no one can determine how a voter voted, no one can hold that voter accountable for her vote. Ensuring privacy thus prevents restrictions on freedom targeted at specific voters. Unfortunately, the concept of privacy is not well understood in the field of voting: there are various terms that each cover an aspect of privacy. We set out to investigate and improve the understanding of privacy in voting. To investigate the current understanding of privacy in voting, we surveyed a number of voting system proposals from the literature (cf. Chapter 3).
We found that while every system claims some form of privacy, several of these claims were dubious and, in several cases, were broken by later works. Moreover, the way privacy is specified varies from system to system: there is no unique notion of privacy in voting. This lack of consensus became more apparent when the risk of vote selling was introduced. Privacy prevents a voter from proving how she voted, which makes selling her vote impossible. Thus, a vote seller will actively try to reduce her privacy, and voting systems should prevent voters from doing so. However, absent a consensus on privacy, it is unclear how to expand the notion of privacy to incorporate vote selling, and hence unclear how to verify that a system enforces sufficient privacy to prevent vote selling. This necessitates a new understanding of privacy. To this end, we introduce a formal framework that determines precisely for whom a voter could have voted (cf. Chapter 4). By comparing a regular voter with a voter who tries to reduce her privacy, we can determine whether a voter can reduce her privacy and, if so, by how much. In this fashion, the privacy of a voter is quantified. Several existing notions of privacy are easily captured in this way, but the precision of the framework extends beyond that: it can also detect partial loss of privacy. To show how the framework and the ideas it embodies improve understanding of voting, we apply the framework and its concepts in Chapter 5. The framework is used to ascertain the privacy of a purely theoretical scheme (FOO92) and of an actually implemented voting system (Prêt à Voter), and the concept of choice groups is used to discuss the relation between privacy and verifiability in the 3BS voting system. Using the framework, we prove that FOO92 offers privacy to voters but does not enforce it: vote selling is possible.
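The framework's central idea, determining the set of candidates a voter could plausibly have voted for from an observer's viewpoint, can be sketched roughly as follows. The function and data below are hypothetical illustrations, not the thesis's formal definitions:

```python
# Minimal sketch, assuming we model a candidate's "observable behaviour" as a
# simple value: a voter's choice group is the set of candidates whose
# behaviour matches what the observer actually saw. The larger the choice
# group, the more privacy the voter retains.

def choice_group(observed, profiles):
    """Return the candidates consistent with the observer's view.

    profiles maps each candidate to the observable behaviour that a vote
    for that candidate would produce.
    """
    return {c for c, behaviour in profiles.items() if behaviour == observed}

# Full privacy: a vote for either candidate looks identical to the observer.
hidden = {"Alice": "encrypted ballot", "Bob": "encrypted ballot"}
assert choice_group("encrypted ballot", hidden) == {"Alice", "Bob"}

# A vote seller deliberately leaks a distinguishing receipt, shrinking her
# choice group to a single candidate, i.e. reducing her own privacy to zero.
leaked = {"Alice": "receipt-A", "Bob": "receipt-B"}
assert choice_group("receipt-A", leaked) == {"Alice"}
```

The second case illustrates why a quantified notion matters: privacy is not merely present or absent, but can be partially reduced as the choice group shrinks.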
The framework also clarifies the reasons for this, and based on this indication, we sketch a possible approach for improving the privacy of FOO92. The Prêt à Voter voting system uses paper ballots and a voting booth. With the help of the framework, the impact of these physical measures on privacy is made explicit. Thus, we can reason about adapting Prêt à Voter for remote voting, where such measures are not necessarily available. Finally, the 3BS system uses a novel way to provide a trade-off between privacy and verifiability. Using the concept of choice groups, we gain a better understanding of how much privacy is sacrificed.

Part II: Fairness in Digital Exchange

There exist various forms of controlling access to items. This is, for example, necessary in trade: without scarcity of items, there is no need to trade for them. With the advent of the Internet, a large-scale distributed system, the question emerged of how to enforce access control to digital items in a distributed environment, as a way to enable online trade. One answer was to distribute access control together with the controlled items. This is what Digital Rights Management (DRM) sets out to achieve; as such, DRM is meant to enable online trade in digital items. Again, a high-level view of the field is necessary to obtain some certainty that the set of requirements for DRM systems covers all desired aspects. In Chapter 6, the domain of DRM is analysed, resulting in a set of security requirements. We highlight two of these requirements: the impact of breaks is constrained, and fair exchange. The fair exchange of two items a and b between two parties ensures that either a and b both switch owners, or neither does. In the usual client-server (client-seller) setting of DRM systems, fair exchange is only ensured for the seller: the seller receives the (assurance of) payment and then proceeds to deliver. Recently, the idea of DRM has emerged in other settings, such as a peer-to-peer (client-to-client) setting.
In such a setting, guaranteeing fair exchange for both seller and buyer becomes an important prerequisite for trading. In Chapter 7, we develop Nuovo DRM, a secure DRM system that enables fair exchange and constrains the impact of breaks. As claims of security must be substantiated, we do so using formal methodology: we formalise the goals of Nuovo DRM and verify, using model checking, that these goals are achieved by an abstract model of Nuovo DRM. In addition, we present several procedures that mitigate the effects of breaks.
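The fair-exchange property described above, either both items switch owners or neither does, can be illustrated with a minimal escrow sketch. This assumes a trusted third party that holds both items and releases them atomically; the class and method names are hypothetical illustrations, not Nuovo DRM's actual protocol:

```python
# Minimal sketch of fair exchange via a trusted third party (TTP): both
# parties deposit their item, and the TTP either completes the whole swap
# or refunds everything. No partial swap is possible.

class TrustedThirdParty:
    """Holds deposited items in escrow and releases them atomically."""

    def __init__(self):
        self.deposits = {}  # party name -> item held in escrow

    def deposit(self, party, item):
        self.deposits[party] = item

    def exchange(self, seller, buyer):
        # Fair exchange: swap only when BOTH deposits are present.
        if seller in self.deposits and buyer in self.deposits:
            return {seller: self.deposits.pop(buyer),
                    buyer: self.deposits.pop(seller)}
        # Abort: refund whatever was deposited, so neither item changes owner.
        return {p: self.deposits.pop(p) for p in list(self.deposits)}

ttp = TrustedThirdParty()
ttp.deposit("seller", "digital item")
ttp.deposit("buyer", "payment")
result = ttp.exchange("seller", "buyer")
assert result == {"seller": "payment", "buyer": "digital item"}
```

The sketch makes the all-or-nothing character of the property concrete; the thesis's contribution lies in achieving such guarantees in a peer-to-peer DRM setting, which this toy escrow does not attempt to model.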
Original language: English
Qualification: Doctor of Philosophy
Awarding Institution
  • Department of Mathematics and Computer Science
Supervisors/Advisors
  • Mauw, Sjouke, Promotor
  • Baeten, Jos, Promotor
  • Pang, J., Copromotor, External person
Award date: 25 Aug 2009
Place of Publication: Eindhoven
Print ISBNs: 978-90-386-1906-4
Publication status: Published - 2009

