Crowding Out the Truth? A Simple Model of Misinformation, Polarization and Meaningful Social Interactions
38 Pages · Posted: 17 Oct 2022 · Last revised: 10 Dec 2022
Date Written: October 1, 2022
Social media are at the center of countless debates on polarization, misinformation, and even the state of democracy in various parts of the world. An essential feature of social media is the ranking algorithm that determines how content is presented to users. This paper studies the dynamic feedback between a ranking algorithm and user behavior, and develops a theoretical framework to evaluate the effect of popularity and personalization parameters on measures of platform and user welfare. The model reveals a fundamental trade-off between platform engagement and user welfare: a higher weight on online social interactions, such as likes and shares, and on personalized content increases engagement but is detrimental in terms of misinformation (crowding out the truth) and polarization. Beyond increasing actual polarization, a greater weight on social interactions may also increase perceived polarization, since it makes individuals more likely to see extreme content, both like-minded and not, in higher-ranked positions. Finally, we provide empirical evidence in support of the main predictions of the model. Leveraging a rich survey dataset from Italy and exploiting Facebook's 2018 "Meaningful Social Interactions" update, which significantly boosted the weight given to social interactions in its ranking algorithm, we find an increase in political polarization and ideological extremism in Italy following the change in Facebook's algorithm.
Keywords: Algorithmic Gatekeeper, Ranking Algorithms, Popularity Ranking, Personalized Ranking, Meaningful Social Interactions, Engagement, Polarization, Misinformation
JEL Classification: D72, D83, L82, L86