Daniel Kahneman Interview: Conversations with History; Institute of International Studies, UC Berkeley

Intuition and Rationality: Conversation with Daniel Kahneman, Nobel Laureate in Economics (2002), Eugene Higgins Professor of Psychology, Princeton University; February 7, 2007, by Harry Kreisler


Successes and Failures of Intuition

In your lecture you talked about individuals -- a fireman, a nurse -- who develop an intuition that is critical to their career. They're able to see things. Talk a little about that. You are approaching the question of how a person develops and masters their intuition.

One of the important developments in recent years in my life has been [that] I try to understand controversy and I try to reduce controversy. Amos Tversky and I made our reputation by finding flaws in what people do. It was the method we used. It's not that we ever thought that people are stupid, but this is what we were doing. Many people have responded to that by saying that we're drawing a distorted picture of human nature.

One of the people who responded to that is Gary Klein, the guru of a movement that is called Naturalistic Decision Making. They're very interested in intuition and deliberately skeptical about the kind of work that we've done. I approached him because I liked his work, actually, and we've been collaborating on an article, so I was citing in my lecture examples from his work on professional intuitions. There's a fireman, a captain of a firefighting company, on the roof suddenly yelling to his company, "Let's get out of here!" just before the house explodes, and then it turns out he wasn't aware of why he was doing it, but his feet were warm and that was the cue that triggered the sense that something very dangerous was going on just underneath them. That's a beautiful example of a perfect intuition, and that's the kind of thing that has been feeding people who think that all our discussions of biases and mistakes are overstated.

So, it sets a very interesting problem that Gary Klein and I have been trying to sort out together. When do intuitions develop and when don't they? Which experts can you trust and which experts shouldn't you trust?

You say that skills are acquired in an environment of feedback and opportunity for learning in a social network. That would help us understand what makes it possible for [professional intuition] to be [successful].

They think about situations a lot and they talk about things a lot, so they develop models of various kinds of fires. They don't have to experience -- you know, we are capable of learning a great deal from simulated experience. Even athletes can learn from simulating things in their minds, and they do: they practice a lot at night without doing anything. This is one piece of machinery that we have at our disposal. It will not help you in certain domains; it's not something that a CIA analyst can do, because the systems that they deal with are fundamentally more complicated.

Which raises the interesting question of how groups can learn from their own experience. Your work is related to decision making in the marketplace, and in a minute we'll talk about your article in Foreign Policy. In those cases, what is the difference when you have institutions and groups that would like to correct these kinds of errors?

Well, in the first place, my main observation would be that groups, by and large, do not correct errors. That's [from] recurrent observations. There's a lot of lip service paid in organizations to improving the quality of our intelligence and the quality of our decision making, but I think it's mainly lip service, because imposing a discipline on decision making, as I illustrated by my example of the book -- you know, I don't want to impose discipline on my decision making, and the leaders of organizations -- civilian and governmental and commercial -- don't like to be second-guessed. It's the rare leader [who accepts it], [although] there are very salient examples; the Cuban missile crisis is the example that people think about, where President Kennedy developed a deliberating team that was superbly efficient in allowing dissent and in allowing ideas, and in slowing down the process of decision making to a rate that was appropriate to the complexity of the situation. That's very rare.

In some domains, for example politics, the name of the game becomes not doing what the other party did -- that's how they learn. Recently you had an article in Foreign Policy applying your analysis to foreign policy decision making, which is pessimistic in the sense that it suggests that hawks have a structural advantage. The choices that they lean toward, whatever the issue, are favored by many of the [biases] that you've identified in individual decision making.

That's right. I think it's significant; I think it's probably true. I didn't expect to see this, but when I made a list of the biases of human thinking and decision making it turned out that in a situation of conflict, when conflict is about to develop or arise, these biases consistently favor hawks over doves, so that the claims of hawks resonate with emotions and beliefs, and with things that people want to believe, more than the claims of doves do. That was the point of that article.

In fact, there's an optimistic bias, "We have all of this military hardware, let's do it," and an illusion of control, "We can go in there and by defeating the adversary we can then take control."

Whenever you look at a serious conflict or a war it's very clear that there was enormous optimism, usually on both sides. It's not only the losing side that was optimistic; the winning side didn't appreciate the costs. And that, I think, is regularly true. It's very difficult to think of a conflict in which anybody had any idea, when it began, how bad it was going to be for both sides. Almost by definition you get optimism on both sides of every conflict, and when conflict arises I think the tendency [is that] pessimism becomes [interpreted as] disloyalty, as doubting our ability to achieve victory. It would take a very, very strong national leader to be able to sustain a doubter in his inner circle when he himself wants to move forward.

And unfortunately, in terms of where we are now, it's difficult to cut losses. I know that if one has a mutual fund that has not done well, one doesn't want to move it for fear of losing even more, even though moving it would be the rational thing to do, so ...

People in general don't like cutting their losses. They're willing to gamble on in the hope of recovering their losses, and that is a very well-known characteristic of individual decision making. In national decision making it's exacerbated, because for national leaders who have led the country close to defeat there is really nothing further to be lost by putting more at risk. There is a real divergence of interest between national leaders and their communities when the time to cut losses arises, because cutting losses is rarely beneficial to the decision maker.


© Copyright 2007, Regents of the University of California