Daniel Kahneman Interview: Conversations with History; Institute of International Studies, UC Berkeley

Intuition and Rationality: Conversation with Daniel Kahneman, Nobel Laureate in Economics (2002), Eugene Higgins Professor of Psychology, Princeton University; February 7, 2007, by Harry Kreisler


Intuitive Thinking

Let's talk a little now about your research. I know that [the Nobel] prize is built on a whole body of research, but I think it would be useful for our audience to understand the very simple ideas, if I can use your words, that had such power and such impact. In your lecture you said the mind is a system of jumps to conclusions, and you're looking at one piece of that.

Yes. What we studied was intuitive thinking in the domain of judgment under uncertainty; this is how we started. That is, how do people assess the probabilities of events and how do they forecast the future intuitively? And our idea was that people use heuristics, which are basically shortcuts. In my lecture -- it's a phrase I coined this week but I'm going to use it -- I did speak of the mind as a machine for jumping to conclusions. We figured out some of the ways that this is done, and I think the most important one that we studied in the first years of our work was something that I now call substitution. That is, you're asked a difficult question, you cannot answer it, but another answer comes to mind, and it's an answer to a related question which is simpler. Without being aware of it, you substitute the answer to the simple question for the answer to the complicated one. This is how lay people and non-experts come up with intuitions about very complicated problems that baffle experts. Some of these intuitions are good, others are not so good, but that's the machinery. So, that was the first part of our work on judgment, developing that idea and fleshing it out.

As your thinking about these issues has developed over time you're focusing on perception, intuition, and cognition. Or perception, emotion, and cognition.

The period when we were working was the heyday of what is called the "cognitive revolution." At the time, psychologists were engaged in exploring the mysteries of cognition, and we were no exception. One novel aspect of what we were doing was that we imported the notion of illusion from perception into cognition. We showed that people make errors of intuitive thinking that have many of the characteristics of illusions. That was because I was teaching perception at the time; this was what filled my mind. The analogy with visual perception, and how it informs the way we think about intuition, has been very important in my thinking all the way through. I have explored this idea in particular over the last five or six years.

Talk a little about what you call "System One," intuition. People realize they do the sorts of things you're describing, but the description that has developed is [balanced by] what you call "System Two," which is rationality.

This is something that has become explicit in my mind only recently. One of the things we learn is that some ideas are taken for granted. For example, when we were working we took rationality [as a given] -- that people are able to compute correct answers to problems. That did not need explaining, so we were not looking at that. We were looking at mistakes, but once the field accepted the notion of mistakes, then it became a challenge. People are sometimes able to figure things out and to do it correctly [and sometimes not].

The resolution of that, which I think is becoming widespread -- it's not original to me, it is generally accepted in psychology -- is that there are those two families of mental processes. I call them (along with others, it's not my term either) System One and System Two, one of which is intuitive, rapid, associative, uncontrolled, automatic thinking, and the other the rule-governed [approach], the way you fill out your income tax form, or compute 17 times 24, or read a map. We're able to think in this [rule-governed] way, but most of the time we run on the software of System One, that is, automatically and with little thought, and by and large with remarkable success and accuracy.

In an interview you said that with fear, probability does not matter as much. The more emotional the event, the less sensible people are.

This is a development of the last ten or fifteen years. The cognitive revolution is now over and what is happening in the field is that emotion is now being [studied]. The views of decision making have changed, and in a way they're going back to what was common sixty, seventy, eighty years ago, that is, the idea that there are very important conflicts between reason and passion. Nobody was thinking along those lines twenty years ago, but many people are thinking along those lines today. One of the attributes that are constantly being evaluated by System One is the emotional significance of events. So, we're continuously evaluating whether things are good or bad or safe or threatening, and so on. Our emotional responses guide us, and our emotional responses guide the ideas that come to our minds.

I read that very important to your work is the way one defines a problem, in the area of risk and loss, for example. Your thinking, as I understand it, opened up our understanding of how people perceive risk and loss in economic decision making, because previously the focus had been just on wealth, the entire fortune that was at stake.

Again, it's a very simple idea. It is true that for 300 years the focus in the analysis of financial decisions, of decisions about money, has been about wealth. People have talked about the "utility" of wealth, which is the psychological response to wealth, and [the way] that people [judge] different prospects and opportunities by evaluating them in terms of what are called final states of wealth. When I started studying this field under the tutelage of Amos Tversky (it was not my specialty), I was struck by the absurdity of it, and I was struck by the fact that this is not the way that people think about risk. They think about risk in terms of gains and losses -- indeed, you often don't even know what wealth you have. It turns out that that is a very fundamental reorganization of the field, to think of outcomes in terms of gains and losses and not in terms of final states. So, it opened up a whole agenda that people are still working out.

So, narrowing the focus by having an intuition about the way people are (or the way you are, as you said), is central to what's going on in your work.

There are several stages to this. This idea of gains and losses and that people evaluate changes -- I distinctly remember how I had that idea. I was reading a chapter in a book that Amos Tversky [had co-authored], and it was a chapter on how people analyze decisions. Many important philosophers had been analyzing the so-called utility function for money, and I noticed that they never asked people about wealth. They said, "What about a gamble in which you could win this amount with a certain probability, or that amount for sure?" They didn't do losses much, it was all about changes.

I came back to Amos and I said, "Hey, what's going on here? They are asking questions about this and then they're plotting a function in terms of that. That is psychologically very implausible."

So, you know, you need to be able to pay attention to this, and then you need to realize that it's important.

On that very day it was obvious that we would never look at wealth again, that we were going to do something that is psychologically realistic, because he was immediately convinced. Somebody else had had that idea before, but they hadn't followed through with it, and we did.

So, you need a very simple idea and then you need to see that it's important. Then you need to follow through. It's a combination of tools.

Does your research offer insights to individuals with regard to how they might change their own behavior?

Yes, potentially it does, but they are not insights that are very easy to use. You can be aware of mistakes you make, [but] it's very difficult to learn to avoid them, because System One operates automatically. Occasionally, when a decision is very important, then you can stop to think, and you have to recognize that you can stop to think. And then there is still the extra stage, which is a very painful and difficult stage, [where] even if your analysis tells you to do something, you don't want to do it. To impose the discipline of rationality on your desires is extremely difficult. I've just lived through something like this in deciding whether or not to write a book: I just want to do it. When I analyze the pros and cons it's absurd for me to do this, but I'm going to do it, I think. So, you know, it's very difficult to impose discipline on yourself.


© Copyright 2007, Regents of the University of California