Years ago, I took part in a mediation for a small consumer class action, not my first and hardly my last. In that case, the mediator did something unusual: she showed up with homemade chocolate chip cookies. The mediation did not result in a settlement, but it went pretty smoothly. Afterwards, I asked the mediator about the cookie gambit. "I find it tends to promote cooperation," she said. "No one gets too aggressive around homemade cookies."
Was she right? There’s been a lot of research into the phenomenon of priming, the idea that verbal or visual cues can predispose us to act in certain ways. But can it really effect significant changes in behavior?
The Prisoner’s Dilemma, for those who don’t geek out over game theory, is a classic problem in the field. It was first developed in the 1950s to explain why individuals might choose selfish strategies, even when they know that cooperating would lead to better outcomes for everyone.
The story that accompanies the Prisoner’s Dilemma goes like this: the police arrest two criminals, and they put them in separate interrogation rooms. The DA has enough evidence to put them each away for a year, but if she can get one of them to confess, she can put the other away for 10 years and let the confessor walk free. If both confess, they will each go to jail for six years.
So each prisoner faces a choice: talk or stay quiet. If both stay quiet, they each serve only a year in jail. So what’s the dilemma? Sitting alone in an interrogation room, neither prisoner knows what the other will do. What each does know is this: if he stays quiet, he winds up in prison no matter what, and he could serve ten years if his partner confesses. If he talks, he might do six years or he might do none, and he knows the other prisoner is thinking the same way. And there is no honor among thieves …
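The logic above can be sketched as a small payoff matrix. This is a minimal illustration (the names `PAYOFFS` and `best_response` are mine, not from any paper) showing why talking is the dominant strategy: whatever the other prisoner does, confessing yields a shorter sentence.

```python
# Payoff matrix from the story: entries are (prisoner A's years, prisoner B's years).
# Lower is better, since these are prison sentences.
PAYOFFS = {
    ("quiet", "quiet"): (1, 1),    # both stay silent: one year each
    ("quiet", "talk"):  (10, 0),   # A stays quiet, B confesses: A serves 10, B walks
    ("talk",  "quiet"): (0, 10),   # A confesses, B stays quiet: A walks, B serves 10
    ("talk",  "talk"):  (6, 6),    # both confess: six years each
}

def best_response(opponent_choice):
    """Return the choice that minimizes prisoner A's sentence,
    holding the opponent's choice fixed."""
    return min(("quiet", "talk"),
               key=lambda my_choice: PAYOFFS[(my_choice, opponent_choice)][0])

# Talking is best no matter what the other prisoner does:
print(best_response("quiet"))  # -> talk (0 years beats 1)
print(best_response("talk"))   # -> talk (6 years beats 10)
```

That is the dilemma in a nutshell: each prisoner's individually rational choice leads both to six years, when mutual silence would have cost them one year apiece.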
Liberman et al. conducted a series of experiments using the Prisoner’s Dilemma with both Stanford undergraduates and Israeli Defense Force pilots. But then they did something interesting with the game: they renamed it. For half of their test subjects, they called it "The Wall Street Game." With others, they called it "The Community Game."
The result? The name of the game had a significant effect on players’ strategies. When the Stanford students played, half of the Wall Street games resulted in mutual defection, while half of the Community games resulted in mutual cooperation. The game was structured slightly differently for the Israeli pilots, but produced similar results. When the game was called the Bursa (Wall Street) game, 16 of the 20 participants started out by defecting. When it was called the Kommuna (Community) game, 11 of the 20 participants started out by cooperating.
One further twist in the experiment provides two more insights. In each case, the experimenters chose their subjects by going to authority figures who knew the participants. For the Stanford students, they went to resident advisors; for the pilots, they went to flight instructors. In each case, they asked the authority figures to stack the deck by nominating the people they thought were most cooperative and the ones they thought were most competitive.
Statistical analysis of the game results showed two things: (1) priming works even on people we think are predisposed to act in the opposite way, and (2) people tend to be very bad at predicting how competitive or cooperative someone will be based on personality alone.