28 Pages Posted: 9 Jan 2017
Date Written: December 8, 2016
Recent research has documented widespread discrimination by hosts against guests of certain races in online marketplaces, which endangers the very basis of the sharing economy: trust within its communities. In this paper, we explore the root cause of this discrimination and how to eliminate it. We conducted two randomized field experiments among 1,256 hosts on Airbnb, creating fictitious guest accounts and sending accommodation requests to the hosts. We find that requests from guests with distinctively African American names are 19 percentage points less likely to be accepted than those from guests with distinctively White names. However, a public review posted on a guest’s page mitigates discrimination: when guest accounts carry a positive review, the acceptance rates for accounts with distinctively White and distinctively African American names are statistically indistinguishable. We further show that a negative review also eliminates discrimination. These findings are consistent with statistical discrimination: lacking perfect information, hosts infer a guest’s quality from race and make rental decisions based on the average predicted quality of each racial group; when enough information is shared, hosts no longer need to infer quality from race, and discrimination disappears. Our results offer direct and clear guidance for sharing-economy platforms on reducing discrimination: platform owners should motivate users to review one another and design better mechanisms to facilitate information sharing, especially information that signals guest quality.
Keywords: Discrimination, Field Experiment, Service Operations, Sharing Economy, Information Sharing
Suggested Citation:
Cui, Ruomeng and Li, Jun and Zhang, Dennis J., Discrimination with Incomplete Information in the Sharing Economy: Field Evidence from Airbnb (December 8, 2016). Available at SSRN: https://ssrn.com/abstract=2882982