A Theory on Morality: Analyzing Prisoner’s Dilemma

Business Insider: Money and Markets Article

Extended Interview Link

In this Business Insider article, Cornell professor Steven Strogatz uses trends observed in a prisoner’s dilemma experiment to postulate how morality among humans could have self-organized into the form we recognize today. The experiment Strogatz cites was performed in the 1980s by Robert Axelrod, who set up an iterated prisoner’s dilemma tournament in which computer programs played each other, with each deciding in every round whether to cooperate with the other program or defect. We know from our studies of game theory that in this game, the best response for each player, regardless of the other player’s choice, is to defect. This results in both prisoners serving a moderate amount of jail time: more than they would have served had both cooperated, but less than a player who cooperated while their opponent defected. In Axelrod’s tournament, the game was played many times, with the programs remembering their opponents’ previous choices. The program that won the most rounds used a “tit for tat” strategy: it took the opponent’s previous choice, either cooperate or defect, and played the same move in the current round. Strogatz comments that this strategy parallels the Old Testament concept of morality, “an eye for an eye.”

What is interesting to note is that Axelrod’s experiment was an idealized version of real life, in which each player perfectly understood the other’s action. Strogatz notes that in reality, there can be miscommunications between the players. One player might believe their opponent is going to cooperate and choose to cooperate as well, only to receive the most severe penalty when the other prisoner, expecting a defection, defects instead. In these situations, both players are trying to pick the option that is best for both, but errors occur in gameplay. Studies of the prisoner’s dilemma played in this noisy mode found that many players adopted forgiving strategies: they were willing to risk receiving the most severe penalty again for the chance that their opponent would instead choose to cooperate, giving them both short punishments. Strogatz likens this to the New Testament principle of “turning the other cheek.” Players who adopted this forgiving strategy avoided the revenge loop of repeated defections that tit for tat can fall into when misunderstandings occur.
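To make these dynamics concrete, here is a minimal Python sketch of a noisy iterated prisoner’s dilemma. The specific sentence values, the 5% noise rate, the 10% forgiveness probability, and the generous_tit_for_tat variant are illustrative assumptions of mine, not details from Axelrod’s tournament; the “generous” strategy simply stands in for the forgiving strategies described above.

```python
import random

# Payoff convention: lower numbers are better (years of jail time).
# These values are illustrative, not taken from Axelrod's tournament.
# Key is (my_move, their_move); value is my sentence in years.
SENTENCE = {
    ("C", "C"): 1,   # mutual cooperation: both serve a short term
    ("C", "D"): 10,  # sucker's payoff: cooperate while the other defects
    ("D", "C"): 0,   # temptation: defect while the other cooperates
    ("D", "D"): 5,   # mutual defection: both serve a moderate term
}

def tit_for_tat(history):
    """Cooperate first, then copy the opponent's previous move."""
    return "C" if not history else history[-1]

def generous_tit_for_tat(history, forgiveness=0.1):
    """Like tit for tat, but occasionally forgive a defection."""
    if not history or history[-1] == "C":
        return "C"
    return "C" if random.random() < forgiveness else "D"

def play(strategy_a, strategy_b, rounds=200, noise=0.05):
    """Iterated game in which each intended move is flipped with
    probability `noise`, modeling the miscommunication Strogatz describes."""
    total_a = total_b = 0
    seen_by_a, seen_by_b = [], []  # each player's record of the other's moves
    for _ in range(rounds):
        move_a = strategy_a(seen_by_a)
        move_b = strategy_b(seen_by_b)
        if random.random() < noise:  # player A's move comes out garbled
            move_a = "D" if move_a == "C" else "C"
        if random.random() < noise:  # player B's move comes out garbled
            move_b = "D" if move_b == "C" else "C"
        total_a += SENTENCE[(move_a, move_b)]
        total_b += SENTENCE[(move_b, move_a)]
        seen_by_a.append(move_b)
        seen_by_b.append(move_a)
    return total_a, total_b

if __name__ == "__main__":
    random.seed(0)
    print("tit for tat vs. itself:     ", play(tit_for_tat, tit_for_tat))
    print("generous variant vs. itself:", play(generous_tit_for_tat, generous_tit_for_tat))
```

With noise switched on, the tit-for-tat pairing tends to lock into the mutual-defection revenge loop after the first garbled move, while the generous variant usually breaks out of that loop and ends with shorter total sentences.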

In either case, the parallel between the trends observed in prisoner’s dilemma games and examples of human morality, particularly as emphasized by the biblical references, does seem to suggest that simply acting with rational thought leads to following accepted moral codes. Hence, Strogatz suggests that one explanation for the morals we adhere to today is not that we inherently know them, but that we learned we can get the furthest by following a certain code. While this is by no means a thoroughly researched and documented proposal, I find it interesting that simple situations analyzed by game theory seem to lend explanations to deeply thought-provoking questions about human existence and behavior. That such connections can exist surely shows that game theory is more than a topic in an undergraduate class; it could even be part of the underlying structure of what it is to be human.
