Strategic Experimentation with Exponential Bandits
University College London - Department of Economics; Washington University in Saint Louis - John M. Olin Business School
University of Oxford - Department of Economics
University of Bonn; Centre for Economic Policy Research (CEPR); CESifo (Center for Economic Studies and Ifo Institute for Economic Research)
CEPR Discussion Paper No. 3814
This paper studies a game of strategic experimentation with two-armed bandits whose risky arm might yield a pay-off only after some exponentially distributed random time. Because of free-riding, there is an inefficiently low level of experimentation in any equilibrium where the players use stationary Markovian strategies with posterior beliefs as the state variable. After characterizing the unique symmetric Markovian equilibrium of the game, which is in mixed strategies, we construct a variety of pure-strategy equilibria. There is no equilibrium where all players use simple cut-off strategies. Equilibria where players switch finitely often between the roles of experimenter and free-rider all lead to the same pattern of information acquisition; the efficiency of these equilibria depends on how the players share the burden of experimentation among themselves. In equilibria where players switch roles infinitely often, they can acquire an approximately efficient amount of information, but the rate at which it is acquired still remains inefficient; moreover, the expected pay-off of an experimenter exhibits the novel feature that it rises as players become more pessimistic. Finally, over the range of beliefs where players use both arms a positive fraction of the time, the symmetric equilibrium is dominated by any asymmetric one in terms of aggregate pay-offs.
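The learning dynamics behind the abstract can be illustrated with a short sketch. In the exponential-bandit setting, a good risky arm produces its first success after an exponentially distributed time with some rate λ, while a bad arm never pays off; absent a success, Bayes' rule drives the posterior belief that the arm is good downward. The function below is a minimal illustration of that updating rule, assuming this standard specification; the parameter names (`p0`, `lam`, `t`) are illustrative, not the paper's notation.

```python
import math

def posterior_no_success(p0, lam, t):
    """Posterior belief that the risky arm is good after experimenting
    on it for time t without observing a success, by Bayes' rule.

    p0  -- prior probability the arm is good
    lam -- success arrival rate if the arm is good
    t   -- elapsed experimentation time with no success
    """
    good = p0 * math.exp(-lam * t)  # P(arm good AND no success by t)
    bad = 1.0 - p0                  # P(arm bad); a bad arm never pays off
    return good / (good + bad)
```

Absent a success the belief decays monotonically toward zero, while a single success fully reveals the arm as good; this one-sided learning is what makes experimentation a public good and invites the free-riding the abstract describes.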
Number of Pages in PDF File: 40
Keywords: Strategic experimentation, two-armed bandits, exponential distribution, Bayesian learning, Markov perfect equilibrium, public goods
JEL Classification: C73, D83, H41, O32
Date posted: May 2, 2003