Third-Party Punishment as a Costly Signal of High Continuation Probabilities in Repeated Games
25 Pages · Posted: 14 Jun 2016 · Last revised: 4 Apr 2017
Date Written: April 4, 2017
Why do individuals pay costs to punish selfish behavior, even as third-party observers? A large body of research suggests that reputation plays an important role in motivating such third-party punishment (TPP). Here we focus on a recently proposed reputation-based account (Jordan et al., 2016) that invokes costly signaling. This account proposed that “trustworthy type” individuals (who are incentivized to cooperate with others) typically experience lower costs of TPP, and thus that TPP can function as a costly signal of trustworthiness. Specifically, it was argued that some but not all individuals face incentives to cooperate, making them high-quality and trustworthy interaction partners; and, because the same mechanisms that incentivize cooperation also create benefits for using TPP to deter selfish behavior, these individuals are likely to experience reduced costs of punishing selfishness. Here, we extend this conceptual framework by providing a concrete, “from-the-ground-up” model demonstrating how this process could work in the context of repeated interactions incentivizing both cooperation and punishment. We show how individual differences in the probability of future interaction can create types that vary in whether they find cooperation payoff-maximizing (and thus make high-quality partners), as well as in their net costs of TPP – because a higher continuation probability increases the likelihood of receiving rewards from the victim of the punished transgression (thus offsetting the cost of punishing). We also provide a simple model of dispersal that demonstrates how types that vary in their continuation probabilities can stably coexist, because the payoff from remaining in one’s local environment (i.e. not dispersing) decreases with the number of others who stay. Together, these models demonstrate, from the ground up, how TPP can serve as a costly signal of trustworthiness arising from exposure to repeated interactions.
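The core logic of the abstract's "types" argument can be sketched numerically. The following is a minimal illustration, not the paper's actual model: it uses an infinitely repeated prisoner's dilemma with illustrative payoffs (T, R, P, S) and a grim-trigger partner, so cooperation is payoff-maximizing exactly when the continuation probability delta exceeds (T − R)/(T − P); the punishment-cost function is a hypothetical stand-in for the victim-reward mechanism described above.

```python
# Illustrative sketch (assumed payoffs, not the paper's model):
# continuation probability delta determines both whether cooperation
# pays and the net cost of third-party punishment (TPP).

T, R, P, S = 5.0, 3.0, 1.0, 0.0  # temptation, reward, punishment, sucker

def cooperation_pays(delta):
    """Against a grim-trigger partner, cooperating forever yields
    R / (1 - delta), while defecting once yields T + delta * P / (1 - delta).
    Cooperation is payoff-maximizing when delta >= (T - R) / (T - P)."""
    return delta >= (T - R) / (T - P)

def net_tpp_cost(punish_cost, victim_reward, delta):
    """Hypothetical offsetting mechanism: the punisher pays punish_cost
    now, but with probability delta interacts again with the victim,
    who rewards the punisher by victim_reward."""
    return punish_cost - delta * victim_reward

# Two "types" differing only in continuation probability:
for delta in (0.2, 0.8):
    print(f"delta={delta}: cooperates={cooperation_pays(delta)}, "
          f"net TPP cost={net_tpp_cost(2.0, 3.0, delta):.2f}")
```

With these assumed numbers, the high-delta type both finds cooperation payoff-maximizing and faces a negative net punishment cost, so its punishment can honestly signal its trustworthiness, while the low-delta type finds punishing strictly costly.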
Keywords: cooperation, trust, direct reciprocity, reputation, evolution, game theory, dispersal