If you could confront the pickpocket who ripped you off in the subway, would you simply demand your wallet back, or would you seek vengeance? Your decision to punish the thief might hinge on whether the thief ended up richer than you, a new study suggests.
According to most economic theories, self-interest is the prime motivator of human behavior. However, studies show that people consistently sacrifice their own welfare to punish cheats. For example, in a classic economic experiment called the "ultimatum game," one player holds a sum of money and can offer as much of it as she likes to a second player. If the second player rejects the offer, neither player gets anything. Rather than accepting whatever is offered, second players consistently reject low offers, preferring to receive nothing rather than allow their rivals to keep the larger share.
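The payoff rule of the ultimatum game can be sketched in a few lines of Python. The 30% rejection threshold below is purely illustrative, not a figure from any study:

```python
def ultimatum_payoffs(pot, offer, accepted):
    """Return (proposer, responder) payoffs for one round.

    If the responder rejects the offer, both players walk away with nothing.
    """
    if accepted:
        return pot - offer, offer
    return 0, 0

def spiteful_responder(pot, offer, threshold=0.3):
    """Illustrative responder who rejects any offer below 30% of the pot."""
    return offer >= threshold * pot

# A lowball offer of $1 out of $10 gets rejected, so both players get nothing;
# a $4 offer is accepted, splitting the pot 6/4.
print(ultimatum_payoffs(10, 1, spiteful_responder(10, 1)))  # (0, 0)
print(ultimatum_payoffs(10, 4, spiteful_responder(10, 4)))  # (6, 4)
```

The responder's behavior is "irrational" by pure self-interest: rejecting costs her the offer, but it denies the proposer the larger share.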
In 1999, Swiss economists Ernst Fehr and Klaus Schmidt defined this spiteful reaction toward cheats and freeloaders as "inequity aversion." They hypothesized that such behavior is essential for cooperation and bargaining, and that it is separate from the desire for revenge, or "reciprocity," as social scientists call it. However, says Fehr, it isn't easy to tease apart the two motivations in experiments, much less real life. "This is a long-standing question that has not been answered to our full satisfaction."
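Fehr and Schmidt formalized inequity aversion as a utility function that penalizes both being behind and being ahead of one's partner. A minimal two-player sketch follows; the alpha and beta values are illustrative, not estimates from their paper:

```python
def fehr_schmidt_utility(own, other, alpha=0.8, beta=0.3):
    """Two-player Fehr-Schmidt utility: own payoff minus penalties for
    disadvantageous inequality (alpha) and advantageous inequality (beta).
    The parameter values here are illustrative assumptions."""
    envy = alpha * max(other - own, 0)   # being behind hurts more...
    guilt = beta * max(own - other, 0)   # ...than being ahead (alpha > beta)
    return own - envy - guilt

# Why reject a $1-of-$10 offer: accepting yields utility 1 - 0.8*8 = -5.4,
# while rejecting (both get 0) yields utility 0.
print(fehr_schmidt_utility(1, 9))
print(fehr_schmidt_utility(0, 0))
```

Under this model, an inequity-averse player prefers mutual zero to a deal that leaves her far behind, which reproduces the rejections seen in the ultimatum game.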
The most recent stab at the problem comes from evolutionary biologists Nichola Raihani of University College London and Katherine McAuliffe of Harvard University. To discover what motivates people to punish cheaters, the researchers recruited 560 volunteers through an online labor market to play a simple game over the Internet. The subjects paired off, with one person in each pair assigned to be the cheater and the other to be cheated. The pairs played one of three scenarios. In the first, the cheating partner started out with significantly less money than the non-cheating partner; he could choose to "steal" 20 cents, but doing so did not raise his fortunes enough to equal his partner's. In the second scenario, the money was distributed so that stealing 20 cents let the thieving partner match his partner's wealth. Only in the third scenario would stealing 20 cents allow the cheating partner to end up richer than the other partner.
Things got interesting when the swindled partners were given a chance to pay 10 cents to punish the cheaters, says Raihani. In the first two scenarios, roughly the same proportion of non-cheating partners paid to punish the cheats. Some petty non-cheaters even punished partners who had chosen not to cheat, a common example of what Raihani calls "baseline horrible behavior." When the cheating partner's wealth ended up surpassing the other partner's, however, punishment more than doubled. "I was really surprised," says Raihani. "I really thought that we would find the opposite result." She says that the study, published online today in Biology Letters, supports the hypothesis that a sense of fairness, rather than the desire to dole out tit-for-tat, motivates punishment.
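Because the theft transfers money from victim to thief, the three conditions differ only in the starting endowments. In the sketch below, the 20-cent theft and the 10-cent punishment fee come from the article; the starting dollar amounts are illustrative assumptions, since the article does not report them:

```python
THEFT = 0.20  # the 20-cent theft described in the article
FEE = 0.10    # the 10-cent cost of punishing, also from the article

def after_theft(cheater_start, victim_start):
    """Payoffs after the cheater steals THEFT from the victim.

    The theft is a transfer: the cheater gains what the victim loses.
    """
    return round(cheater_start + THEFT, 2), round(victim_start - THEFT, 2)

# Illustrative endowments for the three scenarios (dollars):
behind = after_theft(0.10, 0.70)  # thief ends with 0.30 vs 0.50: still poorer
level = after_theft(0.30, 0.70)   # thief ends with 0.50 vs 0.50: draws even
ahead = after_theft(0.50, 0.70)   # thief ends with 0.70 vs 0.50: ends richer

print(behind, level, ahead)
```

Only in the third condition, where the thief ends up richer than the victim, did the rate of paying the 10-cent fee to punish more than double.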
Not so fast, says Herbert Gintis, a behavioral scientist and economist at the Santa Fe Institute in New Mexico. He claims that a flaw in the study design makes it impossible to rule out the possibility that cheated partners were seeking revenge. People punish a bad intention, he says, and despite being anonymous participants in a computer game, all the cheated players knew that their partners were real people who intended harm. The only way to rule out reciprocity as a motivating factor, he says, would be to add a control in which the cheating partner was a computer rather than a person.
Fehr's assessment is somewhat more positive. "I wrote the theory of inequity, so naturally I like the results of this paper," he says with a laugh. Fehr says the data look good, and show that unfairness in the distribution of income has a strong effect on people's decision to punish or not to punish. However, he agrees with Gintis that the study doesn't rule out the influence of reciprocity, and that using a computer control would be the best way to make the study watertight. If people punished a computer in the same way that they did the human cheaters, he says, "that would be the proof."