No Cookies For You!: Evaluating The Promises Of Big Tech’s ‘Privacy-Enhancing’ Techniques.
56 Pages
Posted: 20 Dec 2023
Date Written: December 9, 2023
Abstract
We identify and examine three common principles underlying a slew of "privacy-enhancing" techniques recently deployed or scheduled for deployment by big tech companies: (1) limiting third-party access to users' personal data; (2) minimizing use and retention of users' raw data while still exploiting inferences derived from that data; and (3) ensuring personal data never leaves users' devices. Our article challenges these principles, not because the techniques offered to implement them fail to achieve their stated goals, but because the principles themselves are insufficient to block privacy-violating behavior. Through philosophical analysis and technical scrutiny, we reveal the misalignment between these principles and a sound conception of privacy.
We reinforce our findings empirically with a series of factorial vignette user studies structured by the conception of privacy as contextual integrity. The vignettes systematically vary the data collected, the actor collecting it, whether inferences are drawn from it, whether a third party receives the data, and the purposes for which the data is used. Our studies indicate that, when making privacy judgments, people focus more on whether information is used for contextual or non-contextual purposes than on what data was collected or with whom it was shared. In sum, we find:
1. The distinction between first and third parties is not fundamental to respondents’ privacy preferences.
2. Committing not to use 'raw data' without also expressing commitments about inferences does not assuage respondents' privacy concerns.
3. Purpose matters. Respondents' judgments were sensitive to the purposes for which information is used. For example, using consumer data to target online ads is judged a privacy violation whether performed by a first or third party.
4. Respondents rate selling or using inferences more negatively than selling or using raw data. Paradoxically, respondents slightly preferred that data brokers buy raw data rather than the inferences derived from that same data (though both scenarios were rated negatively).
5. The 'Privacy Sandbox' solution proposed by Google, which draws on all three principles, neither fully addresses privacy expectations nor provides a uniformly superior alternative to third-party trackers and ad networks placing personalized ads.
These findings have implications for policy and practice. First, rules restricting access by third parties are poor proxies for privacy rules. Although they may limit the flow of data to non-contextual actors, such rules overlook questionable practices of first parties acting in non-contextual, inappropriate ways, or, indeed, third parties acting in contextually appropriate ways (e.g., third-party cybersecurity services). Second, law and policy addressing raw data retention and minimization, alone, will miss privacy violations resulting from inappropriate inferences drawn from that data, even if the data itself is restricted and/or minimized. Third, our results show that consumers care about the purpose for which data is used. Policies and laws that focus on who collects information and what information is collected, but do not restrict the purposes these practices serve, will fail to address significant privacy-violating behaviors. Rules and regulations that focus on where data is held or where processing takes place (e.g., only on users' devices) would not address privacy-violating behavior if firms still derive inferences about users from that data and use those inferences in service of inappropriate, non-contextual purposes (e.g., web browsers serving targeted advertising).