
In 2002, Daniel Kahneman, along with Vernon Smith, received the Nobel Prize in economics. Kahneman received his prize “for having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty.”

Kahneman did most of his important work with Amos Tversky, who died in 1996. Before their work, economists had gotten far in their analyses of decision making under uncertainty by assuming that people correctly estimate probabilities of various outcomes or, at least, do not estimate these probabilities in a biased way (see Rational Expectations). Even if some people place too low a probability on an event relative to what is reasonable, economists argued, others will place too high a probability on that same event, and the errors will cancel out. But Kahneman and Tversky found that this is not true: the vast majority of people misestimate probabilities in predictable ways.
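Kahneman and Tversky’s point can be made concrete with a short simulation (a minimal sketch; the true probability, the bias, and the noise level below are invented for illustration). Random individual errors average out across many people; a shared, systematic error does not.

```python
import random

random.seed(0)
TRUE_PROB = 0.30   # the "correct" probability of some event (illustrative)
N = 100_000        # number of people estimating it

# Unbiased noise: individual errors are random but centered on the truth,
# so the average estimate converges to the true probability.
unbiased = [TRUE_PROB + random.gauss(0, 0.10) for _ in range(N)]

# Systematic bias: everyone's error leans the same way (here, +0.15),
# so averaging over many people does not recover the truth.
biased = [TRUE_PROB + 0.15 + random.gauss(0, 0.10) for _ in range(N)]

print(sum(unbiased) / N)  # ~0.30: the errors cancel out
print(sum(biased) / N)    # ~0.45: the errors do not cancel
```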

One bias they found is that people tend to believe in “the law of small numbers”; that is, they tend to generalize from small amounts of data. So, for example, if a mutual fund manager has had three above-average years in a row, many people will conclude that the fund manager is better than average, even though this conclusion does not follow from such a small amount of data. Or if the first four tosses of a coin give, say, three heads, many people will believe that the next toss is likely to be tails. Kahneman recognized this bias in his own thinking as a young military psychologist in the Israeli army. Tasked with evaluating candidates for officer training, he concluded that a candidate who performed well on the battlefield or in training would be as good a leader later as he had shown himself to be during the observation period. As Kahneman explained in his Nobel lecture, “As I understood clearly only when I taught statistics some years later, the idea that predictions should be less extreme than the information on which they are based is deeply counterintuitive.”1
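The coin example is easy to check by simulation (a minimal sketch; the number of trials is arbitrary). Among histories whose first four tosses include exactly three heads, the fifth toss still comes up heads about half the time; a fair coin has no memory.

```python
import random

random.seed(1)
heads_on_fifth = matching_histories = 0
for _ in range(1_000_000):
    tosses = [random.random() < 0.5 for _ in range(5)]  # True = heads
    if sum(tosses[:4]) == 3:            # first four tosses gave three heads
        matching_histories += 1
        heads_on_fifth += tosses[4]     # was the fifth toss heads?

# ~0.5: the next toss is no more likely to be tails
print(heads_on_fifth / matching_histories)
```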

Another bias Kahneman and Tversky found to be common in people’s thinking is “availability,” whereby people judge probabilities by how readily examples come to mind. So, for example, people overstate the risk of driving without a seat belt if they personally know someone who was killed while driving without one. Also, repetition of various stories in the news media, such as stories about children being killed by guns, causes people to overstate the risk of guns to children (see risk and safety).

Kahneman and Tversky also introduced “prospect theory” to explain some systematic choices most people make—choices that contradict the strictly rational model. Kahneman later admitted that the theory’s name was meaningless, but that it was important for getting others to take the theory seriously, itself further evidence that the framing of an issue matters. (See the next paragraph for an example.) Imagine, for example, that someone is given a chance to bet $40 on some outcome and is told, accurately, that his probability of winning $40 is 60 percent, which means that his probability of losing $40 is 40 percent. Most people will refuse such a bet, even though its expected value is a positive $8 (0.6 × $40 − 0.4 × $40). Kahneman and Tversky called this “loss aversion.” This could be written off as simple risk aversion, which is certainly not irrational. What makes it strange, if not outright irrational, is that people act so differently with bigger gambles. Kahneman and Tversky found, for example, that seven out of ten people prefer a 25 percent probability of losing $6,000 to a 50 percent probability of losing either $4,000 or $2,000, with an equal probability (25 percent) for each. In each case the expected loss (each possible loss multiplied by its probability) is $1,500. But here people prefer the gamble with the bigger possible loss ($6,000) to the one with smaller possible losses ($2,000 or $4,000). This choice demonstrates what economists call risk-loving behavior, the opposite of the risk aversion shown for the smaller bet.
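The arithmetic behind the two gambles, and one way prospect theory can account for the choice, is sketched below. The value-function form and the parameters (α = 0.88, λ = 2.25) come from Tversky and Kahneman’s later (1992) cumulative prospect theory estimates, and probability weighting is omitted for simplicity, so treat this as an illustration rather than as the 1979 model itself.

```python
def v(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: diminishing sensitivity to both
    gains and losses, with losses weighted more heavily (lam > 1)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# The expected dollar loss is identical for the two gambles:
print(0.25 * 6000)                # 1500.0
print(0.25 * 4000 + 0.25 * 2000)  # 1500.0

# But v is convex over losses (each extra dollar of loss hurts less),
# so the subjective values differ:
a = 0.25 * v(-6000)                    # one 25% shot at a big loss
b = 0.25 * v(-4000) + 0.25 * v(-2000)  # two 25% shots at smaller losses
print(a, b)  # a ≈ -1188, b ≈ -1283; a > b, so the $6,000 gamble is preferred
```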

Kahneman and Tversky also used prospect theory to explain other systematic behavior that departs from the economist’s rationality assumption. Consider the following situation. Many people will drive an extra ten minutes to save $10 on a $50 toy. But they will not drive ten minutes to save $20 on a $20,000 car. The gain from driving the extra ten minutes for the car is twice the gain of driving the extra ten minutes for the toy. So a higher percentage of people, not a lower one, should drive the longer distance for the saving on the car. Why don’t they? Kahneman and Tversky’s explanation is that the framing of the issue affects the decision. Instead of comparing the absolute saving in price with the cost of going the extra distance, people compare the percentage saving, and the percentage saving in the case of the car is very small.
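The framing point reduces to which comparison people make, which a few lines make explicit (a sketch using the numbers from the example above):

```python
toy_price, toy_saving = 50, 10
car_price, car_saving = 20_000, 20

# The rational comparison: absolute dollars saved for ten minutes of driving.
print(toy_saving, car_saving)  # $10 vs. $20 -- the car trip saves more

# The comparison people seem to make instead: percentage off the price.
print(toy_saving / toy_price)  # 0.20  (20% off the toy)
print(car_saving / car_price)  # 0.001 (0.1% off the car)
```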

Kahneman and Tversky’s work spurred a great deal of research by economists on systematic departures from rational behavior. See behavioral economics for an introduction to many of the issues.

Kahneman was born in Tel Aviv, Israel, and grew up in France. He earned his B.A. in psychology and mathematics at Hebrew University in 1954 and his Ph.D. in psychology from the University of California at Berkeley in 1961. He was a psychology professor at Hebrew University from 1961 to 1978, at the University of British Columbia from 1978 to 1986, at the University of California at Berkeley from 1986 to 1994, and has been a professor at Princeton University since 1993.


Selected Works


1972 (with Amos Tversky). “Subjective Probability: A Judgment of Representativeness.” Cognitive Psychology 3: 430–454.
1973 (with Amos Tversky). “On the Psychology of Prediction.” Psychological Review 80: 237–251.
1974 (with Amos Tversky). “Judgment Under Uncertainty: Heuristics and Biases.” Science 185: 1124–1131.
1979 (with Amos Tversky). “Prospect Theory: An Analysis of Decision Under Risk.” Econometrica 47: 263–291.
1986 (with Jack Knetsch and Richard Thaler). “Fairness and the Assumptions of Economics.” Journal of Business 59: S285–S300.



Footnotes