Echo Chambers, Rabbit Holes, and Algorithmic Bias: How YouTube Recommends Content to Real Users
33 Pages. Posted: 25 May 2022. Last revised: 12 Nov 2022.
Date Written: May 11, 2022
To what extent does the YouTube recommendation algorithm push users into echo chambers, ideologically biased content, or rabbit holes? Using a novel method to estimate the ideology of YouTube videos and an original experimental design that isolates the effect of the algorithm from user choice, we demonstrate that the recommendation algorithm does push real users into mild ideological echo chambers: by the end of the data collection task, liberals and conservatives received different distributions of recommendations, though the difference is small. While this difference grows the longer users follow the algorithm's recommendations, we find no evidence that many users go down "rabbit holes" leading to ideologically extreme content. Finally, we find that YouTube pushes all users, regardless of ideology, toward moderately conservative content and an increasingly narrow range of ideological content the longer they follow its recommendations.
Keywords: YouTube, Recommendation Algorithm, Echo Chambers, Political Polarization, Theory Testing