Introduction
The depth and scope of digital interconnectedness in contemporary commercial, economic, cultural, and social networks are such that collecting individuals’ information is deemed, or expected, to affect both the autonomy individuals enjoy and the scope or depth of the profiles created about them (Kaltheuner, 2021; Boshell, 2019; Zhang, Wang, Karahanna, and Xu, 2022; Cheng, Su, Luo, Benitez, and Cai, 2022). Yet, despite the significance of individual autonomy, considered a right in many societies, and increasing concerns about profiling individuals based on this pervasive data collection, prior Information Privacy Concerns (IPC) models (see Zhang et al., 2022; Hong and Thong, 2013; Xu et al., 2012) do not articulate autonomy or profiling concerns as integral constituents of IPC. This study examines the role and place of two constructs, termed “loss of autonomy” (i.e., autonomy) and “control over profiling” (i.e., profiling), within IPC via three studies. The first study explores autonomy and profiling as first-order constructs of IPC and develops the associated survey instrument items for those constructs. The second empirically assesses these two new first-order constructs to affirm their scale properties when they are operationalized within the broader IPC models. The third is a cross-validation study assessing the generalizability of the proposed integrated IPC model. We find that autonomy and profiling form a new IPC dimension, named self-focused concerns, resulting in an IPC model with three dimensions: self-focused concerns, data-focused concerns, and device-focused concerns. We propose this resultant model as an integrated IPC model.
A Pew Research Center study found a continued prevalence of information privacy concerns: 81% of Americans felt that they had little control over data collected about them (Auxier et al., 2019). While only about 30% of Americans report using contemporary online dating apps, a majority of those users (60%) report being concerned about how much data these apps and websites collect about them (Turner and Anderson, 2020). Another study found that “63% of customers say most companies fail to use their data transparently, and 54% believe most companies don’t use their data in a way that benefits them” (Donegan, 2019).

As Hong and Thong (2013) have argued, privacy concerns go beyond unauthorized access, secondary use, and misuse. In the literature, two main antecedents of privacy concerns have been noted: perceived potential risk and loss of control (Dinev et al., 2004). Several scholars, however, have argued that perceptions of risk and control are intricately related to distrust in organizations (Libaque-Saenz et al., 2016; Malhotra et al., 2004). We argue that trust in an organization is a self-focused concern. As Kaltheuner (2021) states, “Privacy isn’t the opposite of connecting and sharing — it’s fundamentally about human dignity and autonomy.” Similarly, as expressed by Becker (2019), “The right to privacy is essentially the right of individuals to have their own domain, separated from the public.” Consequently, we adopt Cohen’s (2012) conceptualization of privacy: “Privacy [is about] the boundary conditions between self and society, and the ways that those conditions mediate processes of self-formation.” We therefore define self-focused information privacy concerns as those concerns that touch on the individual’s sense of autonomy, self-determination, identity, and dignity. Self-focused information privacy entails those concerns that undermine an individual’s sense of operating in the world as an independent, rational being.
Hence, issues such as loss of autonomy and concerns about profiling take prominence.