In the emergence of food sharing, "the Inuit knows that the best place for him to store his surplus is in someone else's stomach."
There is a tension between self-interest and collective action. Yet symbiosis and cooperation have been observed at every level from cell to ecosystem, and there are genetic reasons why this is so. It was that stream of thought that sparked my interest in the subject of an innovation commons (Creating an Innovation Commons).
When we get to the level of human interaction, factors other than genetics play a vital role. Game theory is a tool that has helped us understand how and why we cooperate.
Howard Rheingold treats the subject in Smart Mobs. He writes, "Game theory is based on several assumptions; that the players are in conflict, that they must take action, that the results of the actions will determine which player wins according to definite rules, and that all players (this is the kicker) are expected to always act rationally by choosing the strategy that will maximize their gain regardless of the consequences to others. These are the kind of rules that don't fit real life with predictive precision, but they do attract economists, because they map onto behavior of observable phenomena like markets, arms races, cartels, and traffic."
The game that has attracted the most attention is the Prisoner's Dilemma. It is of particular interest to us because it has something to say about cooperation.
Basically, the Prisoner's Dilemma story is this:
Two prisoners are charged with the same crime and are being held separately by the police. The prisoners cannot communicate with each other. A prisoner who testifies against his/her partner will go free, and the partner will be sentenced to three years in jail. If both prisoners testify against each other, they each get a two-year sentence. And if neither testifies, they both get a one-year sentence.
Clearly, if they both pursue their common interest and stay silent, collectively they serve the shortest amount of time: two years. The other outcomes total three, three, and four years collectively. However, if one decides to pursue their self-interest, i.e., think only of themselves, they could go free. But if both act in what they perceive to be their own self-interest, they maximize their loss, individually and collectively.
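The logic above can be checked by enumerating the four outcomes. This is a minimal sketch using the sentences from the story (not any particular author's code); "testify" is the self-interested move, "silent" the cooperative one.

```python
# Each entry maps (my_move, partner_move) to (my_sentence, partner_sentence),
# in years served, exactly as in the story above.
SENTENCES = {
    ("silent", "silent"): (1, 1),    # neither testifies: one year each
    ("silent", "testify"): (3, 0),   # partner testifies: I serve three, partner goes free
    ("testify", "silent"): (0, 3),   # I testify: I go free, partner serves three
    ("testify", "testify"): (2, 2),  # both testify: two years each
}

# Whatever the partner does, testifying shortens my own sentence...
for partner in ("silent", "testify"):
    mine_if_silent = SENTENCES[("silent", partner)][0]
    mine_if_testify = SENTENCES[("testify", partner)][0]
    assert mine_if_testify < mine_if_silent

# ...yet the collective total is worst when both follow that logic.
totals = {moves: sum(years) for moves, years in SENTENCES.items()}
print(totals[("silent", "silent")])    # 2 years collectively
print(totals[("testify", "testify")])  # 4 years collectively
```

This is the dilemma in miniature: testifying is the better reply to either move the partner makes, yet mutual testimony doubles the collective sentence.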
The game became really interesting, on both a practical and theoretical level, when researchers began to model the Iterated Prisoner's Dilemma. The game is not played just once, but many times. In this case, history matters: what happened the time before, or all the times before, influences the present game. The future also impacts the present. How might the other player react in the future to my actions now? In Rheingold's words, "'Reputation' is another way of looking at this 'shadow of the future.'"
This is a game simulation ideal for computers. In a now-famous experiment, Robert Axelrod proposed a "Computer Prisoner's Dilemma Tournament" wherein various strategies of playing the game, represented by computer programs, would play against each other. "He ran fourteen entries against each other and against a random rule over and over. 'To my considerable surprise,' Axelrod reported, 'the winner was the simplest of all the programs submitted, TIT FOR TAT. TIT FOR TAT is merely the strategy of starting with cooperation and thereafter doing what the other player did on the previous move.'"
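Axelrod's fourteen entries aren't reproduced here, but a minimal sketch of iterated play shows why TIT FOR TAT holds up. The payoffs below are the standard tournament points (mutual cooperation 3 each, mutual defection 1 each, lone defector 5, sucker 0), which are an assumption of this sketch rather than a quote from the book.

```python
# "C" = cooperate, "D" = defect; standard tournament payoffs.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    # Cooperate first, then copy the opponent's previous move.
    return opponent_history[-1] if opponent_history else "C"

def always_defect(opponent_history):
    return "D"

def always_cooperate(opponent_history):
    return "C"

def play(strategy_a, strategy_b, rounds=200):
    """Total points for each strategy over repeated games."""
    hist_a, hist_b = [], []  # each side sees only the other side's moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(hist_a), strategy_b(hist_b)
        pts_a, pts_b = PAYOFF[(move_a, move_b)]
        score_a += pts_a
        score_b += pts_b
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

print(play(tit_for_tat, always_cooperate))  # (600, 600): sustained cooperation
print(play(tit_for_tat, always_defect))     # (199, 204): exploited only once
```

Note what the two runs illustrate: TIT FOR TAT never beats its opponent head-to-head, but it earns full cooperative points against cooperators and loses only the first round against defectors, which is enough to win a round-robin tournament on total score.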
Axelrod repeated the experiment with professors of evolutionary biology, physics and computer science. He made them all aware of the results of the first experiment. The results were the same.
These results raised the question of how a cooperative strategy could gain a foothold in a predominantly uncooperative environment. Axelrod's experiments showed that, "Within a pool of entirely uncooperative strategies, cooperative strategies evolve from small clusters of individuals who reciprocate cooperation, even if the cooperative strategies have only a small proportion of their interactions with each other. Clusters of cooperatives amass points for themselves faster than defectors can. Strategies based on reciprocity can survive against a variety of strategies, and 'cooperation, once established on the basis of reciprocity, can protect itself from invasion by less cooperative strategies. Thus the gear wheels of social evolution have a ratchet.'"
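The clustering claim can be put on the back of an envelope. This sketch assumes the standard tournament payoffs (3/3 for mutual cooperation, 1/1 for mutual defection, 5 for a lone defector, 0 for the sucker) and 200 rounds per pairing; it simplifies by letting defectors meet almost only other defectors.

```python
ROUNDS = 200

# Per-pairing totals for TIT FOR TAT (TFT) under iterated play:
TFT_VS_TFT = 3 * ROUNDS                 # 600: sustained cooperation
TFT_VS_DEFECTOR = 0 + 1 * (ROUNDS - 1)  # 199: exploited only in round one
DEFECTOR_VS_DEFECTOR = 1 * ROUNDS       # 200: mutual defection throughout

def average_score(p_cluster):
    """Average pairing score when a TFT player meets another TFT player
    with probability p_cluster and a defector otherwise; defectors are
    assumed (a simplification) to meet only other defectors."""
    tft = p_cluster * TFT_VS_TFT + (1 - p_cluster) * TFT_VS_DEFECTOR
    defector = DEFECTOR_VS_DEFECTOR
    return tft, defector

# Even if only 5% of a TFT player's interactions are with other TFT
# players, the cluster already outscores the surrounding defectors:
tft, defector = average_score(0.05)
print(tft > defector)  # True

# A lone TFT player (no cluster at all) still falls just short:
tft, defector = average_score(0.0)
print(tft > defector)  # False
```

This is the "small clusters" mechanism in the quoted passage: isolated reciprocators lose by a single round's worth of points, but even a thin network of mutual cooperation tips the balance, and once established it is hard for defectors to invade.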
"Cooperatives can thrive amid populations of defectors if they learn how to recognize one another and interact with one another," he concludes. "Cooperators who clump together can out compete noncooperative strategies by creating public good that benefit themselves but not the defectors...Reciprocity, cooperation, reputation, social grooming and social dilemmas all appear to be fundamental pieces of the smart mob puzzle."
If it is the best strategy for an Inuit to store his surplus of food, a scarce resource, in the stomach of another, surely it is better to store knowledge, an abundant resource, in the minds of others. The food gets used up. The knowledge generates new knowledge.
Resources:
1. Howard Rheingold, Smart Mobs, Basic Books, 2002, pages 38-46
2. Game Theory, Wikipedia, http://en.wikipedia.org/wiki/Game_theory
3. Game Theory, Stanford Encyclopedia of Philosophy, http://plato.stanford.edu/entries/game-theory/
4. Prisoner's Dilemma, http://www.iterated-prisoners-dilemma.net/, an iterated prisoner's dilemma game and simulation