Echo Chambers, Rabbit Holes, and Algorithmic Bias: How YouTube Recommends Content to Real Users

30 Pages Posted: 25 May 2022

Megan A. Brown

Center for Social Media and Politics - NYU

James Bisbee

New York University (NYU) - Department of Politics

Angela Lai

New York University (NYU)

Richard Bonneau

New York University (NYU)

Jonathan Nagler

NYU - Wilf Family Department of Politics

Joshua A. Tucker

New York University (NYU)

Date Written: May 11, 2022

Abstract

To what extent does the YouTube recommendation algorithm push users into echo chambers, ideologically biased content, or rabbit holes? Despite growing popular concern, recent work suggests that the recommendation algorithm is not pushing users into these echo chambers. However, existing research relies heavily on anonymous data collection that does not account for the personalized nature of the recommendation algorithm. We asked a sample of real users to install a browser extension that downloaded the list of videos they were recommended. We instructed these users to start on an assigned video and then click through 20 sets of recommendations, capturing what they were shown in real time as they used the platform logged into their real accounts. Using a novel method to estimate the ideology of a YouTube video, we demonstrate that the YouTube recommendation algorithm does, in fact, push real users into mild ideological echo chambers: by the end of the data collection task, liberals and conservatives received different distributions of recommendations, though the difference is small. While we find evidence that this difference grows the longer a user follows the recommendation algorithm, we do not find evidence that many users go down 'rabbit holes' leading to ideologically extreme content. Finally, we find that YouTube pushes all users, regardless of ideology, toward moderately conservative content and toward an increasingly narrow range of ideological content the longer they follow YouTube's recommendations.

Keywords: YouTube, Recommendation Algorithms, Echo Chambers, Theory Testing

Suggested Citation

Brown, Megan and Bisbee, James and Lai, Angela and Bonneau, Richard and Nagler, Jonathan and Tucker, Joshua Aaron, Echo Chambers, Rabbit Holes, and Algorithmic Bias: How YouTube Recommends Content to Real Users (May 11, 2022). Available at SSRN: https://ssrn.com/abstract=4114905 or http://dx.doi.org/10.2139/ssrn.4114905

Megan Brown (Contact Author)

Center for Social Media and Politics - NYU

Paper statistics

Downloads: 336
Abstract Views: 1,537
Rank: 126,849