Observational Learning in Large-Scale Congested Service Systems
34 Pages Posted: 18 Oct 2017
Date Written: October 17, 2017
We study the impact of observational learning in large-scale congested service systems in which servers have heterogeneous quality levels and customers are heterogeneously informed about server quality. Providing congestion information to all customers allows them to avoid congested servers, but it also implies that less informed customers learn about quality by observing the choices of other customers. Because the state space grows exponentially in the number of servers, identifying Bayesian equilibria is intractable with a large, discrete number of servers. In this paper, we develop a tractable model with a continuum of servers. We find that the impact of observational learning on customers' choice behavior may lead to severe "imbalance" of server load, such that a decentralized system significantly underperforms a centralized system in terms of social welfare. The decentralized system performs well only when (a) congestion costs are high and there are sufficiently many informed customers, or (b) congestion costs are medium or low and the aggregate capacity of high-quality servers matches the aggregate demand of informed customers. We also find situations in which informing more customers about service quality decreases social welfare. Our paper highlights the tension between observational learning and social welfare maximization; observational learning in large-scale service systems may therefore require intervention by the platform manager.
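The imbalance mechanism described above can be illustrated with a minimal sketch. This is not the paper's continuum model: it assumes a finite bank of servers, a linear congestion cost, and reduces uninformed customers' observational learning to a naive herding rule (join the busiest server, since popularity suggests quality). All parameter values (`n_high`, `frac_informed`, `congestion_cost`) are hypothetical.

```python
import numpy as np

# Toy illustration (not the paper's model): many servers, two quality
# levels, and a sequence of arriving customers who are either informed
# or uninformed about server quality.
rng = np.random.default_rng(0)

n_servers = 100
n_high = 30                      # assumed number of high-quality servers
quality = np.where(np.arange(n_servers) < n_high, 1.0, 0.5)
load = np.zeros(n_servers)

n_customers = 5000
frac_informed = 0.3              # assumed share of informed customers
congestion_cost = 0.01          # assumed linear waiting cost per unit of load

for _ in range(n_customers):
    if rng.random() < frac_informed:
        # Informed: know quality and trade it off against observed congestion.
        choice = np.argmax(quality - congestion_cost * load)
    else:
        # Uninformed: observational learning reduced to naive herding --
        # join the busiest server, treating congestion as a quality signal.
        choice = np.argmax(load)
    load[choice] += 1

# Herding concentrates load far above the balanced level n_customers / n_servers,
# whereas a central planner would spread demand across high-quality servers.
print("max load:", load.max(), "mean load:", load.mean())
```

In this sketch the uninformed majority piles onto a single server, so the maximum load ends up an order of magnitude above the balanced mean, which is the kind of load imbalance the abstract refers to.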
Keywords: Observational Learning, Bayesian Inference, Load-balancing
JEL Classification: D83, D7