Crawl Across the Ocean

Tuesday, July 27, 2010

61. The Evolution of Cooperation (part 1 of 2)

Note: This post is the sixty-first in a series about government and commercial ethics. Click here for the full listing of the series. The first post in the series has more detail on the book 'Systems of Survival' by Jane Jacobs, which inspired this series.

The Evolution of Cooperation, by Robert Axelrod, is perhaps the most famous book ever written on the Prisoner's Dilemma, and possibly on game theory in general.

Reading it again, for the first time in a long time, I could see why it is so popular - it manages to cover a lot of ground with very clear, accessible prose.

The Evolution of Cooperation starts off by recounting a famous game theory tournament. Participants were invited to submit a strategy, or 'rule', that would play a Prisoner's Dilemma against the strategies submitted by other people. The strategies were paired up against each other in turn, with each pair playing a repeated Prisoner's Dilemma for a set number of rounds. The goal was to achieve the highest possible point total, adding up the scores from the matches against all the other strategies.

Recall that the nature of the Prisoner's Dilemma is such that, no matter what action your opponent takes, you will maximize your own total by defecting rather than cooperating. But by changing the situation from a single game to a repeated game, and by allowing participants to retain a memory of what happened before and by allowing them to clearly identify who they were playing against, the tournament introduced a strong signalling element into the Dilemma.
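To make that one-shot logic concrete, here is a small Python sketch. The specific payoff numbers (3 each for mutual cooperation, 1 each for mutual defection, 5 for defecting on a cooperator, 0 for being defected on) are the values conventionally used in discussions of Axelrod's tournament; nothing in the argument depends on the exact figures, only on their ordering.

```python
# The one-shot Prisoner's Dilemma: whatever the opponent does, defecting
# pays more than cooperating. Payoff values are illustrative (see above).

PAYOFF = {  # (my move, opponent's move) -> my payoff
    ('C', 'C'): 3,  # reward for mutual cooperation
    ('C', 'D'): 0,  # sucker's payoff
    ('D', 'C'): 5,  # temptation to defect
    ('D', 'D'): 1,  # punishment for mutual defection
}

for their_move in ('C', 'D'):
    cooperate = PAYOFF[('C', their_move)]
    defect = PAYOFF[('D', their_move)]
    assert defect > cooperate  # defection dominates either way
    print(f"Opponent plays {their_move}: defect earns {defect}, cooperate earns {cooperate}")
```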

The tournament was won by the simplest strategy submitted, a strategy known as 'Tit for Tat.' Tit for Tat starts off by cooperating (Axelrod refers to strategies that start by cooperating as 'nice' strategies), and then on each round it simply reacts to what the strategy it is matched up with did the previous round: if that strategy defected on the last round, Tit for Tat defects this round, and if it cooperated on the last round, Tit for Tat cooperates this round.

After the results of the first tournament were published, a second one with more entries was held, but Tit for Tat again turned out to be the winner.
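For anyone curious about the mechanics, here is a rough Python sketch of that kind of tournament: a Tit for Tat rule, a couple of other toy rules standing in for the real entries, and a round-robin that adds up each rule's total. With such a small, hand-picked field the final ranking depends entirely on which rules are included, so this is only meant to show the machinery, not to reproduce the tournament's result.

```python
# A toy round-robin repeated Prisoner's Dilemma tournament. The strategies and
# the 200-round match length are my own illustrative choices, not the actual
# tournament entries or settings.

from itertools import combinations

PAYOFFS = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
           ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(my_history, their_history):
    """Cooperate first, then copy whatever the opponent did last round."""
    return 'C' if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return 'D'

def grudger(my_history, their_history):
    """Cooperate until the opponent defects once, then defect forever."""
    return 'D' if 'D' in their_history else 'C'

def play_match(rule_a, rule_b, rounds=200):
    """Play one repeated game and return the two total scores."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = rule_a(hist_a, hist_b)
        move_b = rule_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

def tournament(rules, rounds=200):
    """Every rule plays every other rule (and a copy of itself); totals are summed."""
    totals = {name: 0 for name in rules}
    for a, b in combinations(rules, 2):
        score_a, score_b = play_match(rules[a], rules[b], rounds)
        totals[a] += score_a
        totals[b] += score_b
    for name in rules:
        self_score, _ = play_match(rules[name], rules[name], rounds)
        totals[name] += self_score
    return totals

if __name__ == '__main__':
    rules = {'Tit for Tat': tit_for_tat,
             'Always Defect': always_defect,
             'Grudger': grudger}
    for name, score in sorted(tournament(rules).items(), key=lambda kv: -kv[1]):
        print(f'{name:15s} {score}')
```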

Strategies aren't fixed over time, and people might change their approach if they see another approach that is working better. Or those using a poor strategy might die out (or get fired) and be replaced by someone with a better strategy. Or some people may simply decide to try a new approach that they thought up. Through these sorts of mechanisms, the distribution of strategies, or rules, being used in the population can evolve over time.
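A crude way to picture that sort of evolution is to let each rule's share of the population grow in proportion to how well it scores against the current mix. The sketch below is my own toy version of that idea (three rules and hard-coded match scores built from the same illustrative payoffs as above), not Axelrod's actual simulation, but in this run it shows the kind of pattern he describes: the Always Defect rule gains ground briefly while there are unconditional cooperators to prey on, then declines once they are gone.

```python
# Toy 'ecological' simulation: a rule's share of the population grows in
# proportion to its average score against the current population mix.
# The three rules and the pairwise match totals (200 rounds, same payoffs
# as above) are illustrative choices, not Axelrod's actual entries.

SCORE = {  # (row rule, column rule) -> row rule's total over a 200-round match
    ('TFT',  'TFT'):  600, ('TFT',  'AllD'): 199, ('TFT',  'AllC'): 600,
    ('AllD', 'TFT'):  204, ('AllD', 'AllD'): 200, ('AllD', 'AllC'): 1000,
    ('AllC', 'TFT'):  600, ('AllC', 'AllD'):   0, ('AllC', 'AllC'): 600,
}

def next_generation(shares):
    """Grow each rule's share in proportion to its average score this generation."""
    fitness = {a: sum(SCORE[(a, b)] * shares[b] for b in shares) for a in shares}
    average = sum(fitness[a] * shares[a] for a in shares)
    return {a: shares[a] * fitness[a] / average for a in shares}

shares = {'TFT': 1/3, 'AllD': 1/3, 'AllC': 1/3}
for generation in range(30):
    shares = next_generation(shares)
    if generation % 10 == 0:
        print(generation, {name: round(share, 3) for name, share in shares.items()})
```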

An evolutionarily stable strategy is one that, even if everybody in a population is using it, can't be invaded by some other strategy designed to take advantage of it. Axelrod notes that a population where everybody defects is evolutionarily stable against any single invader, because a lone cooperator never meets anyone who will reciprocate its cooperation. But even a small cluster of cooperators can invade a much larger population of defectors if the conditions are right, because they do well enough cooperating with each other to offset their poor results against the defectors.

But the reverse invasion doesn't work: a population where everybody plays a nice strategy like Tit for Tat can't be invaded by an 'Always Defect' strategy, even one arriving in a cluster, because the Tit for Tats do so well playing each other that a defector's one-time gain from exploiting them, followed by nothing but mutual defection, can't keep up. This is a hopeful result (for those who like to see cooperation) since it suggests that a cooperative equilibrium is more stable than a defecting one, and that even a small group of cooperators can sometimes thrive in a sea of defectors.
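The arithmetic behind the invasion argument is simple enough to check directly. The sketch below assumes 200-round matches, the same payoff values as above, and a parameter p for the fraction of an invader's interactions that happen with fellow invaders; that clustering parameter is my own simplification of the setup, but it captures the point.

```python
# Back-of-the-envelope invasion check. Pairwise totals over a 200-round match
# with the illustrative payoffs used above; p is the assumed fraction of an
# invader's interactions that are with other members of its own cluster.

ROUNDS = 200
TFT_VS_TFT = 3 * ROUNDS              # mutual cooperation every round -> 600
TFT_VS_ALLD = 0 + 1 * (ROUNDS - 1)   # exploited once, then mutual defection -> 199
ALLD_VS_TFT = 5 + 1 * (ROUNDS - 1)   # one temptation payoff, then mutual defection -> 204
ALLD_VS_ALLD = 1 * ROUNDS            # mutual defection every round -> 200

def tft_invader_score(p):
    """Average match score of a Tit for Tat invader in a world of Always Defect."""
    return p * TFT_VS_TFT + (1 - p) * TFT_VS_ALLD

def alld_invader_score(p):
    """Average match score of an Always Defect invader in a world of Tit for Tat."""
    return p * ALLD_VS_ALLD + (1 - p) * ALLD_VS_TFT

# The defecting natives average roughly ALLD_VS_ALLD = 200 per match, and even a
# little clustering lets the Tit for Tat invaders beat that:
for p in (0.0, 0.01, 0.05, 0.25):
    print(f'TFT cluster, p={p:.2f}: {tft_invader_score(p):6.1f}  (natives ~200)')

# The cooperating natives average roughly TFT_VS_TFT = 600 per match, and no
# amount of clustering gets the defectors anywhere close:
for p in (0.0, 0.25, 1.0):
    print(f'AllD cluster, p={p:.2f}: {alld_invader_score(p):6.1f}  (natives ~600)')
```

With these numbers, the Tit for Tat invaders only need about one interaction in four hundred to be with each other to come out ahead of the surrounding defectors, while the defecting invaders top out at 204 per match, far short of the 600 the cooperators earn among themselves.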

Based on the results of the tournaments, and the success of Tit for Tat, Axelrod offers the following suggested courses of action for doing well in a repeated Prisoner's Dilemma type situation:

1) Don't be envious

As we saw before, envy can transform an absolute gain into a relative loss, and a positive-sum situation into a zero-sum one. A common theme throughout the book is the distinction between absolute gains, made possible by the non-zero-sum nature of the Prisoner's Dilemma, and zero-sum situations where only relative gains are possible.

2) Don't be the first to defect

'Nice' rules, which never defect first, do well when playing with each other. This means that 'mean' rules, which defect first, end up with lower scores against nice opponents than nice rules do.

3) Reciprocate both cooperation and defection

A failure to reciprocate cooperation leads to unnecessary defection on both sides. A failure to reciprocate defection (by defecting in return the next round) leads to being taken advantage of.

4) Don't be too clever

Unlike in a zero-sum game, where you don't want your opponent to figure out what you are up to, in a Prisoner's Dilemma it is important that those who are willing to cooperate can recognize that you are willing to cooperate as well. Tit for Tat is a simple, transparent rule that helps other rules understand what they are dealing with and act accordingly. And since the best plan when facing Tit for Tat is to cooperate, rules will generally cooperate once they figure out that is what their opponent is playing.

* * *

Moving along, Chapter 4 shows that friendship is not necessary for cooperation to develop, recounting the story of the 'live and let live' system that emerged in the trenches during World War I, where enemy units cooperated by not killing each other while facing off across the same piece of ground for months at a time.

Chapter 5 shows that even creatures with very limited intelligence (e.g. bacteria) can engage in cooperation in Prisoner's Dilemma type situations. It also theorizes that the cooperation born from Kin Selection (the notion that it makes sense for us to evolve so that we are willing to make sacrifices for those we share genes with) might have provided a foothold of cooperation that could have spread into the sort of reciprocal tit for tat cooperation that would extend across larger groups of people, regardless of whether they are related or not.

I'll cover the rest of 'The Evolution of Cooperation' and talk about some of the implications of the ideas covered in it in next week's post.
