Groups or collectivities determined algorithmically by extracting subsets or classes of similar individuals based on shared habits and characteristics.
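The kind of group extraction described above is often performed with unsupervised clustering over behavioral features. As a hedged illustration (not the authors' method), the sketch below uses a minimal k-means implementation to partition hypothetical fitness-app users into groups by their workout habits; all feature names and data values are invented for the example.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: partition feature vectors into k clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])),
            )
            clusters[nearest].append(p)
        # Recompute each center as the mean of its assigned points.
        centers = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return clusters

# Hypothetical per-user features: (avg. workout start hour, avg. distance in km).
users = [(6.0, 5.0), (6.5, 4.8), (7.0, 5.2),     # early-morning runners
         (21.0, 2.0), (21.5, 2.5), (22.0, 1.8)]  # late-evening joggers
groups = kmeans(users, k=2)
```

On this toy data the algorithm recovers the two behavioral groups, illustrating how individuals can be assigned to a collectivity they never explicitly joined, which is precisely what raises group-privacy concerns.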
Published in Chapter:
What Can Fitness Apps Teach Us About Group Privacy?
Miriam J. Metzger (University of California, Santa Barbara, USA), Jennifer Jiyoung Suh (University of California, Santa Barbara, USA), Scott Reid (University of California, Santa Barbara, USA), and Amr El Abbadi (University of California, Santa Barbara, USA)
Copyright: © 2021
Pages: 30
DOI: 10.4018/978-1-7998-3487-8.ch001
Abstract
This chapter begins with a case study of Strava, a fitness app that inadvertently exposed sensitive military information even while protecting individual users' information privacy. The case study is analyzed as an example of how recent advances in algorithmic group-inference technologies threaten privacy, both for individuals and for groups. The chapter then argues that while individual privacy from big data analytics is well understood, group privacy is not. Results of an experiment designed to better understand group privacy are presented. Findings show that group and individual privacy are psychologically distinct and uniquely affect people's evaluations, use, and tolerance of a fictitious fitness app. The chapter concludes with a discussion of the ethics of group-inference technologies and offers recommendations for fitness app designers.