The Evolution of Cooperation


Robert Axelrod

Basic Books, Inc., Publishers

New York

An earlier version of chapter 2, tables 1-5, and figures 1-2 appeared in The Journal of Conflict Resolution (1980).
An earlier version of chapter 3 appeared in the American Political Science Review (1981).
An earlier version of chapter 5 appeared in Science 211 (27 March 1981):1390-96. Copyright 1981 by the American Association for the Advancement of Science.

Library of Congress Cataloging in Publication Data

Axelrod, Robert M. The evolution of cooperation.

Bibliography: p. 223

Includes index.

1. Cooperativeness. 2. Games of strategy (Mathematics) 3. Conflict management. 4. Egoism. 5. Consensus (Social sciences) 6. Social interaction. I. Title.

HM131.A89 1984

302'.14 83-45255

ISBN 0-465-02122-0 (cloth)

ISBN 0-465-02121-2 (paper)

Copyright © 1984 by Robert Axelrod
Printed in the United States of America
Designed by Vincent Torre
86 87 88 MPC 9 8 7 6 5 4 3 2





Contents

1. The Problem of Cooperation

The Emergence of Cooperation

2. The Success of TIT FOR TAT in Computer Tournaments

3. The Chronology of Cooperation

Cooperation Without Friendship or Foresight

4. The Live-and-Let-Live System in Trench Warfare in World War I

5. The Evolution of Cooperation in Biological Systems (with William D. Hamilton)

Advice for Participants and Reformers

6. How to Choose Effectively

7. How to Promote Cooperation

8. The Social Structure of Cooperation

9. The Robustness of Reciprocity











THIS PROJECT began with a simple question: When should a person cooperate, and when should a person be selfish, in an ongoing interaction with another person? Should a friend keep providing favors to another friend who never reciprocates? Should a business provide prompt service to another business that is about to be bankrupt? How intensely should the United States try to punish the Soviet Union for a particular hostile act, and what pattern of behavior can the United States use to best elicit cooperative behavior from the Soviet Union?
There is a simple way to represent the type of situation that gives rise to these problems. This is to use a particular kind of game called the iterated Prisoner's Dilemma. The game allows the players to achieve mutual gains from cooperation, but it also allows for the possibility that one player will exploit the other, or the possibility that neither will cooperate. As in most realistic situations, the players do not have strictly opposing interests. To find a good strategy to use in such situations, I invited experts in game theory to submit programs for a Computer Prisoner's Dilemma Tournament—much like a computer chess tournament. Each program would have available to it the history of the interaction so far and could use this history in making its choice of whether or not to cooperate on the current move.

Entries came from game theorists in economics, psychology, sociology, political science, and mathematics. I ran the fourteen entries and a random rule against each other in a round robin tournament. To my considerable surprise, the winner was the simplest of all the programs submitted, TIT FOR TAT. TIT FOR TAT is merely the strategy of starting with cooperation, and thereafter doing what the other player did on the previous move.
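The rules just described can be sketched in a few lines of Python. The payoff values used below (3 points each for mutual cooperation, 1 each for mutual defection, 5 for defecting against a cooperator, and 0 for being exploited) are the conventional Prisoner's Dilemma values; the function names and structure are illustrative, not the tournament's actual code.

```python
# Payoffs: (my move, their move) -> my score.  'C' = cooperate, 'D' = defect.
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def tit_for_tat(history):
    """Cooperate on the first move, then echo the other player's last move."""
    return 'C' if not history else history[-1][1]

def always_defect(history):
    """A maximally uncooperative rule, for comparison."""
    return 'D'

def play(strategy_a, strategy_b, rounds=200):
    """Iterate the game; each strategy sees the history from its own side."""
    history_a, history_b = [], []   # lists of (my move, their move) pairs
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a)
        move_b = strategy_b(history_b)
        history_a.append((move_a, move_b))
        history_b.append((move_b, move_a))
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
    return score_a, score_b
```

Played against itself for 200 moves, TIT FOR TAT earns 600 points a side; against a player who always defects, it is exploited on the first move only and then defects along, so the defector's edge stays fixed at a few points no matter how long the game runs.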
I then circulated the results and solicited entries for a second round of the tournament. This time I received sixty-two entries from six countries. Most of the contestants were computer hobbyists, but there were also professors of evolutionary biology, physics, and computer science, as well as the five disciplines represented in the first round. As in the first round, some very elaborate programs were submitted. There were also a number of attempts to improve on TIT FOR TAT itself. TIT FOR TAT was again sent in by the winner of the first round, Anatol Rapoport of the University of Toronto. Again it won.
Something very interesting was happening here. I suspected that the properties that made TIT FOR TAT so successful in the tournaments would work in a world where any strategy was possible. If so, then cooperation based solely on reciprocity seemed possible. But I wanted to know the exact conditions that would be needed to foster cooperation on these terms. This led me to an evolutionary perspective: a consideration of how cooperation can emerge among egoists without central authority. The evolutionary perspective suggested three distinct questions. First, how can a potentially cooperative strategy get an initial foothold in an environment which is predominantly noncooperative? Second, what type of strategy can thrive in a variegated environment composed of other individuals using a wide diversity of more or less sophisticated strategies? Third, under what conditions can such a strategy, once fully established among a group of people, resist invasion by a less cooperative strategy?
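The third question can be made concrete with a little arithmetic, again under the conventional payoff values. This is an illustrative sketch, not the book's formal analysis: in a population of TIT FOR TAT players, a lone always-defect invader exploits each native once and then faces mutual defection, while the natives keep cooperating with one another.

```python
# Conventional payoffs: mutual cooperation 3, mutual defection 1,
# defection against a cooperator 5, cooperation against a defector 0.

def native_score(rounds):
    """TIT FOR TAT meeting TIT FOR TAT: cooperation on every move."""
    return 3 * rounds

def invader_score(rounds):
    """Always-defect meeting TIT FOR TAT: one exploitation (5 points),
    then mutual defection (1 point per move) for the rest of the game."""
    return 5 + 1 * (rounds - 1)

def resists_invasion(rounds):
    """The established strategy holds if natives outscore the invader."""
    return native_score(rounds) > invader_score(rounds)
```

With a single interaction the invader comes out ahead (5 points against 3), but once interactions last three moves or more the natives do better, so under these assumptions the cooperative population can resist invasion when relationships are durable enough.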
The tournament results were published in the Journal of Conflict Resolution (Axelrod 1980a and 1980b), and are presented here in revised form in chapter 2. The theoretical results about initial viability, robustness, and stability were published in the American Political Science Review (Axelrod 1981). These findings provide the basis for chapter 3.
After thinking about the evolution of cooperation in a social context, I realized that the findings also had implications for biological evolution. So I collaborated with a biologist—William Hamilton—to develop the biological implications of these strategic ideas. This resulted in a paper published in Science (Axelrod and Hamilton 1981) which appears here in revised form as chapter 5. The paper has been awarded the Newcomb Cleveland Prize of the American Association for the Advancement of Science.
This gratifying response encouraged me to present these ideas in a form that would make them accessible not only to biologists and mathematically oriented social scientists but also to a broader audience interested in understanding the conditions that can foster cooperation among individuals, organizations, and nations. This in turn led me to see applications of the ideas in a great variety of concrete situations, and to appreciate how readily the results could be used to generate implications for private behavior and for public policy.
One point worth stressing at the outset is that this approach differs from that of Sociobiology. Sociobiology is based on the assumption that important aspects of human behavior are guided by our genetic inheritance (e.g., E. O. Wilson 1975). Perhaps so. But the present approach is strategic rather than genetic. It uses an evolutionary perspective because people are often in situations where effective strategies continue to be used and ineffective strategies are dropped. Sometimes the selection process is direct: a member of Congress who does not accomplish anything in interactions with colleagues will not long remain a member of Congress.
It is a pleasure to acknowledge the help received at various stages of this project from Jonathan Bendor, Robert Boyd, John Brehm, John Chamberlin, Joel Cohen, Lou Erste, John Ferejohn, Patty French, Bernard Grofman, Kenji Hayao, Douglas Hofstadter, Judy Jackson, Peter Katzenstein, William Keech, Martin Kessler, James March, Donald Markham, Richard Matland, John Meyer, Robert Mnookin, Larry Mohr, Lincoln Moses, Myra Oltsik, John Padgett, Jeff Pynnonen, Penelope Romlein, Amy Saldinger, Reinhart Selten, John David Sinclair, John T. Scholz, Serge Taylor, Robert Trivers, David Sloan Wilson, and especially Michael Cohen. I would also like to thank all the people whose entries made the tournaments possible. Their names are given in appendix A.
With gratitude I acknowledge the institutions that made this work possible: the Institute of Public Policy Studies of The University of Michigan, the Center for Advanced Study in the Behavioral Sciences, and the National Science Foundation under Grant SES-8023556.