Privacy-enhancing technologies (PETs), which constitute a wide array of technical means for protecting users’ privacy, have gained considerable momentum in both academia and industry. However, existing surveys of PETs fail to delineate what sorts of privacy the described technologies enhance, which makes it difficult to differentiate between the various PETs. Moreover, those surveys do not cover important recent developments in PET solutions. The goal of this chapter is therefore two-fold. First, we provide an analytical framework for differentiating PETs, consisting of high-level privacy principles and concrete privacy concerns. Second, we use this framework to evaluate representative, up-to-date PETs with regard to the privacy concerns they address and how they address them (i.e., which privacy principles they follow). Based on the findings of this evaluation, we outline several future research directions.
The Privacy Landscape
Privacy has been studied for decades, and many different definitions of privacy have been proposed. This is largely due to the fact that privacy is “an overwhelmingly large and nebulous concept” (Boyle & Greenberg, 2005). Young (1978) wittily commented that “privacy, like an elephant, is … more readily recognized than described”. In essence, privacy is personal, nuanced, dynamic, situated and contingent (Dourish & Anderson, 2006; Palen & Dourish, 2002).
When privacy considerations are taken into account in the design of computer systems, they constrain the possible design space for such systems: solutions that violate privacy constraints can no longer be considered. Privacy constraints for computer systems stem primarily from two sources, namely privacy laws and regulations and the personal privacy expectations of computer users. Figure 1 shows the hierarchy of these constraints, with a focus on privacy laws and regulations.
Figure 1. The hierarchy of potential privacy constraints
Key Terms in this Chapter
XACML: A general-purpose access control language that can be used to describe access control decision requests and responses as well as access control rules and policies.
APPEL: A P3P preference language that allows users to express their privacy preferences.
Anonymity: The property that a user cannot be identified within the total user population, nor can her interactions be tracked.
P3P: A machine-readable (XML) language that allows websites to describe their privacy practices and P3P-enabled user agents (e.g., web browsers) to retrieve these privacy policies automatically and potentially analyze them.
Pseudonymity: The property that a user cannot be identified within the total user population, but her interactions can nevertheless be tracked.
Identity Management: The management and provisioning of information about users across different applications (users may sometimes maintain different identities with partially different characteristics).
Authentication: A process for verifying the digital identity of users or processes.
Authorization: A process for verifying whether an identified user or role enjoys specific access rights to certain resources.
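To make the P3P entry above more concrete, the following is a minimal sketch of a P3P policy fragment using the P3P 1.0 vocabulary. The policy name, discuri, and the particular statement (collecting clickstream data for site administration, kept only for the stated purpose) are illustrative assumptions, not taken from the chapter:

```xml
<!-- Illustrative P3P 1.0 policy fragment (names and URIs are hypothetical). -->
<POLICIES xmlns="http://www.w3.org/2002/01/P3Pv1">
  <POLICY name="example-policy"
          discuri="http://www.example.com/privacy.html">
    <!-- Users cannot access identified data about themselves. -->
    <ACCESS><nonident/></ACCESS>
    <STATEMENT>
      <!-- Why the data is collected ... -->
      <PURPOSE><admin/><develop/></PURPOSE>
      <!-- ... who receives it ... -->
      <RECIPIENT><ours/></RECIPIENT>
      <!-- ... and how long it is retained. -->
      <RETENTION><stated-purpose/></RETENTION>
      <DATA-GROUP>
        <DATA ref="#dynamic.clickstream"/>
      </DATA-GROUP>
    </STATEMENT>
  </POLICY>
</POLICIES>
```

A P3P-enabled user agent would retrieve such a policy automatically and could match it against the user’s APPEL preferences, e.g., blocking sites whose policies declare recipients beyond &lt;ours/&gt;.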