Predictions and Privacy: Should There Be Rules About Using Personal Data to Forecast the Future?

48 Cumb. L. Rev. 149 (2018)

62 Pages Posted: 6 Jul 2021

Hideyuki MATSUMI

Vrije Universiteit Brussel (VUB); Keio University

Date Written: April 4, 2018

Abstract

Companies now regularly use data to infer all kinds of information about people. These inferences have real consequences for people’s lives, yet data privacy law lacks a principled viewpoint to distinguish and discuss the different characteristics and risks of various types of predictions. Whether it is called profiling, inferences drawn, or predictive analytics, privacy and data protection law typically treats all types of predictions equally. As a result, the law fails to appreciate and address the different risks associated with each. This Article provides an analytical framework that will help lawmakers distinguish between the different types of predictions. Specifically, it categorizes predictions along certainty lines. Present predictions are vested. These are guesses about past or present facts, like whether you are married, and the data subject knows or has reasonable means to know whether or not they are true or accurate. Because they are vested, people can dispute inaccurate present predictions. The other type of prediction forecasts an unvested future, like whether you will get divorced within two years. Unlike present predictions, people cannot confront the accuracy of future forecasting because the predicted event has not happened yet. Data protection regimes often give data subjects the right to correct inaccurate information, but how can one say a prediction about the future is inaccurate?

This Article concludes that lawmakers should better distinguish between predictions that have already vested and those that have not. Doing so would help lawmakers determine when substantive accuracy should be demanded and when it is impossible. When information has not yet vested, important legal remedies like the right of correction are less effective, and a turn to procedural accuracy might be required to fill the gap. But when information can be ascertained as accurate, lawmakers should be less content to rely upon pure proceduralism, preferring substantive accuracy where reasonable. Future forecasting is fundamentally distinct from, and more dangerous than, predictions about present, vested information, because data subjects are less capable of confronting automated forecasting about people’s future behavior. Meanwhile, companies are content to rely upon procedural protections even when forecasting a speculative future can adversely affect people and communities. With a better analytical framework based around degree of certainty, lawmakers, courts, and scholars can discern and discuss the risks associated with various types of prediction in a more precise way and address the unique challenges raised by future forecasting.

Keywords: privacy, data protection, profiling, inferences drawn, machine learning, predictive analytics, predictions, future forecasting, accuracy

JEL Classification: K24, K29

Suggested Citation

MATSUMI, Hideyuki, Predictions and Privacy: Should There Be Rules About Using Personal Data to Forecast the Future? (April 4, 2018). 48 Cumb. L. Rev. 149 (2018). Available at SSRN: https://ssrn.com/abstract=3222217

Hideyuki MATSUMI (Contact Author)

Vrije Universiteit Brussel (VUB) ( email )

Pleinlaan 2
http://www.vub.ac.be/
Brussels, 1050
Belgium

Keio University

5322 Endo
Fujisawa-shi, Kanagawa 252-0882
Japan
