Discrimination with Incomplete Information in the Sharing Economy: Evidence from Field Experiments on Airbnb
35 Pages · Posted: 9 Jan 2017 · Last revised: 27 Oct 2017
Date Written: December 8, 2016
Recent research has found widespread discrimination by hosts against guests of certain races in online marketplaces. In this paper, we explore ways to reduce such discrimination using online reputation systems. We conduct three randomized field experiments among 1,508 hosts on Airbnb by creating fictitious guest accounts and sending these hosts accommodation requests. We find that requests from guests with African American-sounding names are 19.2 percentage points less likely to be accepted than those from guests with White-sounding names. However, a review posted on a guest's page significantly reduces discrimination: when guest accounts receive a review, whether positive or not, the acceptance rates for accounts with White-sounding and African American-sounding names are statistically indistinguishable. We further show that self-reported information signaling a guest's tidiness and friendliness does not reduce discrimination, underscoring the importance of incentivizing peer-generated reviews. Our findings are consistent with statistical discrimination: when relevant information is lacking, hosts infer a guest's quality from prior beliefs about the average quality of each racial group; when more information is provided, hosts place less weight on a guest's race when assessing quality, and statistical discrimination is reduced. Our results offer direct and clear guidance for sharing-economy platforms seeking to reduce discrimination.
Keywords: Discrimination, Field Experiment, Information Sharing, Service Operations, Sharing Economy