Consumer Finance and AI: The Death of Second Opinions?

59 Pages · Posted: 15 Apr 2019 · Last revised: 20 Dec 2020

Nizan Geslevich Packin

University of Haifa - Faculty of Law; City University of NY, Baruch College, Zicklin School of Business; City University of New York (CUNY) - Department of Law

Date Written: March 28, 2019

Abstract

In today’s world, people routinely rely on the advice of algorithms for all aspects of their lives, from mundane tasks like choosing the most efficient navigation route home, to significant financial decisions regarding how to invest their retirement savings. Because of the ubiquity of algorithms, people have become increasingly comfortable relying on them—a tendency known as automation bias. This Article presents an original empirical study that explores automation bias in the area of consumer finance. The study confirms that when making consumer finance decisions, including significant investment decisions, Americans significantly prefer following the recommendations of algorithms to those of human experts. Moreover, the study finds that even after their investments perform poorly as a result of following an algorithm’s advice, consumers continue to favor algorithms over human experts and feel more confident that algorithms give them better recommendations. This result demonstrates that we view algorithms—especially those rooted in big data—as a superior authority.

Our increasing deference to algorithmic results is concerning because we are avoiding obtaining “a second opinion”—a term often used in the medical context—even when the first opinion comes from an algorithm that has made mistakes in the past. Although second opinions are costly and may not always be economically efficient, they are important—and even critical—in certain situations. For example, second opinions can be critical for high-stakes decisions, for decisions about which experts disagree or that involve many options, or in situations where the decision-maker does not like the outcome but is unqualified to evaluate the soundness of the first opinion. By reducing the acceptability of seeking second opinions, our algorithm-dependent society is nudging us to tone down human traits such as creativity, innovation, and critical thinking, and instead to rely blindly on the new experts—the algorithms, which are black boxes whose biases are difficult to assess.

Second opinions do not necessarily need to be human-formulated opinions. In the era of big data and AI, different algorithms that are based on dissimilar data and assumptions can offer second opinions and introduce more options to users. In fact, algorithmic second opinions may be more objective than human-formulated second opinions because a human reviewing the first algorithmic opinion is herself affected by automation bias. Given our significant automation bias, great care must be taken to ensure objective human second opinions.

The Article concludes by suggesting cultural changes: hyper-nudging users to seek second opinions, including AI-based opinions, and requiring algorithmic auditing.

Suggested Citation

Packin, Nizan Geslevich, Consumer Finance and AI: The Death of Second Opinions? (March 28, 2019). New York University Journal of Legislation and Public Policy (2020); Baruch College Zicklin School of Business Research Paper No. 2019-04-06. Available at SSRN: https://ssrn.com/abstract=3361639

Nizan Geslevich Packin (Contact Author)

University of Haifa - Faculty of Law

Mount Carmel
Haifa, 31905
Israel

City University of NY, Baruch College, Zicklin School of Business

One Bernard Baruch Way
New York, NY 10010
United States

City University of New York (CUNY) - Department of Law

New York, NY
United States

Paper statistics

Downloads: 500 · Abstract Views: 3,086 · Rank: 105,246