Episode 4: Intuition and Rationality

“3 questions … Thinking about thinking … Thinking fast and slow … Conversation with Daniel Kahneman (Part 1) … Pupils and anchors … Celebrity spotting … Availability … Linda the feminist bank teller … Meet Rudy … Becoming an expert … Conversation with Daniel Kahneman (Part 2) … In a nutshell … Uncut conversation with Daniel Kahneman”
(Source URL)

Summaries

  • Episode 4 - Intuition and Rationality > Thinking about thinking
  • Episode 4 - Intuition and Rationality > Thinking fast and slow
  • Episode 4 - Intuition and Rationality > Conversation with Daniel Kahneman (Part 1)
  • Episode 4 - Intuition and Rationality > Pupils and anchors
  • Episode 4 - Intuition and Rationality > Availability
  • Episode 4 - Intuition and Rationality > Linda the feminist bank teller
  • Episode 4 - Intuition and Rationality > Becoming an expert
  • Episode 4 - Intuition and Rationality > Conversation with Daniel Kahneman (Part 2)
  • Episode 4 - Intuition and Rationality > In a nutshell
  • Episode 4 - Intuition and Rationality > Uncut conversation with Daniel Kahneman

Episode 4 – Intuition and Rationality > Thinking about thinking

  • I mean, some common examples are: we only use 10 percent of our brains, or right-brained people are more creative than left-brained people, or you can tell whether someone’s lying from a polygraph test.
  • I think you’re exactly right, but in order to understand the way the mind works and give people a realistic idea of the kinds of processes involved, we also need to understand the environment we’re operating in, specifically when we’re making everyday sorts of decisions.
  • What sorts of decisions do we make every day? What to wear, whether to buy a lawnmower, what brand you should buy, whether to buy a latex mattress or a box-spring mattress, whether to stay in the same job you have, choosing a mate, what sorts of things you purchase as you go through a grocery store, whether to spank your child, whether to be a vegetarian or not; the list goes on.
  • If that’s the case, most of these decisions that we make aren’t under ideal conditions.
  • Under ideal conditions, you’d have a big desk, completely empty except for some paper; you could make a pros-and-cons list and weigh up whether to spank your child or not.
  • If you go into The Sleep Store, they would absolutely say that there’s a massive difference between them and try to upsell you, but I don’t have all day to sit and do the research necessary to make these sorts of decisions.
  • Whether you decided to go to the movies instead of going to dinner, or whether you decided to become a vegetarian: how can you possibly, with a sample size of one, evaluate the quality of that decision? How do you know you were right?
  • The other thing is even if somebody does have the right answer, if somebody knows whether it was the right decision or not, people aren’t going to tell you because we have these strange social ideas that you shouldn’t critique people too much, so feedback is terrible.
  • The world is extremely complex and extremely ambiguous, right? The quality of most of the decisions we’re making isn’t immediately apparent. When we’re talking about complexity and ambiguity, I mean, we gave a few toy examples in the beginning, in episode two, where we were talking about different illusions and how people see them differently.
  • When you’re determining whether somebody is talented or not, you can look at that notion in a million different sorts of ways.
  • If that’s the case, then under most decision-making circumstances, there are billions of ways that it could be, so it’s very ambiguous as to whether something is going to be perceived in one way or another.
  • So it’s not an easy task, making these sorts of everyday decisions.
  • We’re going to start with a distinction that Daniel Kahneman makes in his book “Thinking, Fast and Slow,” and we’re going to talk to him in this episode.

Episode 4 – Intuition and Rationality > Thinking fast and slow

  • Danny Kahneman makes the distinction between system one and system two.
  • I think it’s worth spelling out a little bit exactly the character of those two systems.
  • Another example is just, two plus two equals… You know what I mean? Four just pops into your head. Bread and…butter hopefully pops into your head. Again, the nature of this processing means that it just happens to you.
  • Another example we revisited in episode two is this idea of illusions.
  • You see two squares marked A and B. When you’re looking at the squares, you can’t help but see one square as darker than the other.
  • So if you were to actually measure them (we encourage people to do some arts and crafts here), you would see that the two squares are in fact identical, but you can’t help but see them as completely different.
  • As we talked about on episode two, people have this conception when it comes to higher order properties like learning and memory that we have some sort of introspective access, that we have some control over these sorts of things, but in fact, as Nisbett said, we actually don’t.
  • Instead of two plus two, we have something like-I don’t know-17 times 24.
  • Yes, it doesn’t exactly come rolling off the tongue, like two plus two does.
  • Another one might be-so a system one property might be recognizing a familiar face, so you can recognize your friend immediately, and you can recognize someone-your mother, for example-in a crowd even.
  • That’s kind of the distinction between system one and system two.
  • Then you’re talking to me about quite complicated scientific processes and principles, and I can tell that your mind is not necessarily on the road but is thinking about these things.
  • That’s another facet of system two kind of thinking actually.
  • System two is slow and deliberate, but its resources are limited, so when you’re trying to do two complex tasks, such as driving on the opposite side of the road from the one you’re used to while explaining a complex phenomenon to me, those two interfere with each other.
  • So I think next time we’re in the US, you should keep focused on the road, but maybe when we’re in Australia when your system one is taking control of the wheel, then we can have a chat about more complicated things.
  • You’d like to think that we’d be immune to these sorts of effects.
  • Hopefully-I mean, this distinction between system one and system two is quite important.
  • The cognitive reflection test assesses your ability to suppress the kind of quick system one response, and your ability to rely on the slow and deliberate system two response.
  • The best-known question is the bat-and-ball problem: a bat and a ball together cost $1.10, and the bat costs a dollar more than the ball, so how much does the ball cost? The intuitive answer is 10 cents, but if you spend a little bit of time thinking about it, you realize that the bat has to cost a dollar and 5 cents, and the ball 5 cents.
  • That’s the only way these two things can add up to 1.10.
  • So as I’ve said, in order to get this question right, you’ve got to really suppress that initial response and think a little bit more about it.
  • It’s the same thing with the other two questions on the cognitive reflection test.
  • I think the second one is: if it takes 5 machines 5 minutes to make 5 widgets, how long does it take 100 machines to make 100 widgets?
  • It’d be great if people could go back to those questions and use their system two to figure out exactly what the correct response to each one is.
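Since both answers hinge on simple arithmetic, they are easy to verify. A minimal sketch in Python, checking the two problems quoted above (the function name and parameters are just for illustration):

```python
import math

# Bat and ball: work in cents to avoid floating-point noise.
ball = 5            # the intuitive answer, 10 cents, would make the total $1.20
bat = ball + 100    # the bat costs $1.00 more than the ball
assert bat + ball == 110   # together they cost $1.10
assert bat == 105          # so the bat is $1.05 and the ball 5 cents

# Widgets: 5 machines take 5 minutes to make 5 widgets, so each
# machine makes one widget in 5 minutes, working in parallel.
def minutes_needed(machines, widgets, minutes_per_widget=5):
    # each machine produces widgets independently, one per 5 minutes
    return minutes_per_widget * math.ceil(widgets / machines)

assert minutes_needed(5, 5) == 5
assert minutes_needed(100, 100) == 5   # not 100 minutes
```

The point of spelling it out is exactly the system two move the hosts describe: the intuitive answers (10 cents, 100 minutes) fail these checks.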
  • Rationality is what we mean by system two processing, the extent to which people put in the effort to process the information a bit more carefully.
  • I think it’s worth keeping in mind-again, the point of this exercise is to recognize the distinction between system one and system two.
  • Obviously, the grandfather of heuristics and biases and the originator of the distinction between system one and system two is Danny Kahneman.
  • Now we had a conversation with Danny in New York, and he really made that distinction between the two systems.

Episode 4 – Intuition and Rationality > Conversation with Daniel Kahneman (Part 1)

  • The title of your book is “Thinking, Fast and Slow,” and you talk about two systems, system one and two.
  • Can you give us an example or tell us a bit about the characters in the book? Well, the characters are indeed system one and system two.
  • System one corresponds to a distinction that everybody recognizes in their own thinking: that there are some thoughts that just happen to you and some thoughts that you must generate.
  • When I say, “Two plus two,” a number comes to your mind.
  • Sometimes I will stop and choose which word-that’s system two-but most of the time, when I speak the words just come, so that’s system one.
  • System two is-well, there are really two types of operations that system two performs, and one is complex computations.
  • When you are indeed choosing your words carefully because you don’t want to offend, those are situations in which system two is hard at work, and you feel it, so it corresponds… System one and System two really correspond to experiences that are readily available and that everybody recognizes.
  • The dichotomy that you’ve drawn between system one and two, how does that relate to the previous work you’ve done on heuristics and biases? Well, it turns out we had-Amos Tversky and I, when we started our work, we had something in mind that was fairly similar to that.
  • That was the beginning, but we never studied what I now call system two.
  • Then our work became controversial, and people attacked it and criticized it.
  • We pointed out that in his experiments, typically people would see-well, how would I describe it? One of our best-known examples in heuristics, and it’s one of the best examples in the heuristics literature, is the Linda example.
  • Then we asked people how likely it is that Linda now is a bank teller, or how likely it is that she is now a bank teller and is active in the feminist movement.
  • Now there’s no question that when you ask different people those two questions, they will invariably say that it’s more likely that she’s a feminist bank teller rather than a bank teller.
  • When you ask them the two questions to compare the two options, you’re allowing system two to check logic.
  • By priming logical reasoning, you can sensitize people so that they will detect that obviously she is more likely to be a bank teller than a feminist bank teller, but that seems to be a different process.
  • When you show them the two things together, they can also compare them, and you provide another cue.
  • That was really the background to the distinction between the two systems with the controversy around our work.

Episode 4 – Intuition and Rationality > Pupils and anchors

  • He just mentioned it briefly, but what he was actually measuring was pupil size: you can film somebody’s pupil and project it onto a wall beside them, for example, and then you can measure it literally with a ruler.
  • We can keep doing this and adding digits, and as you try to remember the digit span as we keep adding more and more digits, your pupil just keeps getting larger and larger, until one of two things happens: one, you report the number, so you say, “Okay, 6-4-3-2-7,” and then your pupil constricts again; or you give up.
  • One group he asked, “Was Gandhi older than 140 years old when he died?” Another group he asked, “Was Gandhi older than 9 years old when he died?” These people responded-they were guessing how old Gandhi was when he died-and the people that were given the high anchor of 140 guessed that Gandhi died at 67 years old.
  • Now you may think that, “Well, that’s kind of reasonable if an experimenter or someone, they might have some inside information about the correct answer to the question, and so it’s quite reasonable to anchor your decision based on a number that they say,” but it can’t be working like that because there’s another great example by Dan Ariely.
  • He asked participants to write down their social security number, just the last two digits, and then he split them into two groups.
  • If their social security number is higher than 50, they go to this group; lower than 50, in other group.
  • Then he asked them, “How much would you bid on these things, on these bottles of wine and these chocolate?” Now he found that people who had a high social security number, greater than 50, for example, they were willing to pay way more for these things than people who wrote down a low social security number before the experiment.
  • You can see that this arbitrary writing down of a number is influencing their decisions.
  • I think people with a high social security number were willing to pay 60 to 120 percent more for these things than the low social security number, which I think is pretty cool.
  • When you’re in this anchoring position, if you’re given any number, any number, obviously, even random numbers, the same thing happens with the roll of a roulette wheel.
  • If a number comes up, say 10, then you use that as your anchor, regardless of whatever random process generated it-but under most conditions it’s not random.
  • Under most conditions, when we’re operating in the world, a number appears, that’s something to start with.
  • In complex scenarios, if you have no idea, for example, of the percentage of African countries in the UN, or the population of Australia, say, you have to start somewhere, and any number that you have is better than no number at all.
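The anchoring-and-adjustment story above can be sketched as a toy simulation: estimates start at the anchor and move only part of the way toward the truth, so high anchors leave systematically higher guesses. This is purely illustrative; the `adjustment` and noise parameters are assumptions, not values from the actual experiments (Gandhi did in fact die at 78):

```python
import random

random.seed(0)
TRUE_AGE = 78  # Gandhi's actual age at death

def anchored_estimate(anchor, adjustment=0.75, noise_sd=5):
    # Toy anchoring-and-adjustment model: start at the anchor and
    # adjust only part of the way toward the true value, plus noise.
    return anchor + adjustment * (TRUE_AGE - anchor) + random.gauss(0, noise_sd)

high_group = [anchored_estimate(140) for _ in range(1000)]  # "older than 140?"
low_group = [anchored_estimate(9) for _ in range(1000)]     # "older than 9?"

mean = lambda xs: sum(xs) / len(xs)
# Insufficient adjustment leaves the high-anchor group guessing older.
assert mean(high_group) > mean(low_group)
```

The design choice to model insufficient adjustment as a fractional move toward the truth is just one common way to caricature the effect; the real mechanism is debated.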

Episode 4 – Intuition and Rationality > Availability

  • People might remember from the list there was Brad Pitt, Tom Cruise, Steve Jobs: faces that most everyone in the course would recognize.
  • Now the thinking here with this sort of experiment-it’s been done time and time again-but the idea here is that my bet would be that most people would have remembered the 12 males that were in that original list-maybe not explicitly, but they may have remembered them-and when they were thinking back to the list of males, these people may have stood out.
  • The idea here is that it’s the ease of cognitive processing: thinking back to the males went down a little more easily, and so people misinterpret that ease of processing as a sign that the male category is larger than the female category.
  • Are people actually remembering? Do they have a list of the celebrities in their heads? Do they remember Brad Pitt? Is that on an internal list? Not necessarily.
  • Now even without having any of those words come to mind, people will quickly recognize that the letter string at the bottom will produce more words than the letter string at the top.
  • People misinterpret the ease of processing which could be due to any number of reasons as being indicative of the larger category.
  • We don’t hear much on the 6 PM news about the people who died that night of asthma or heart disease.
  • I’d be willing to bet that people would pay far higher insurance premiums to protect themselves from the causes of death they hear a lot about in the media, versus what the base rates actually show people are most likely to die of.
  • Over time, as our memory of that event fades, I think the willingness to pay those high premiums will drop, and people will be far less likely to pay a lot for flood insurance.
  • If you were to ask people how likely someone would die from a shark attack, they would say that it’s way more likely than it actually is, at the expense of things like heart disease and so on, which people really underestimate because they never hear about them.
  • If there’s a relatively minor event that happens-say, a tremor or something like that-and you have this news agency who blows it out of proportion, makes it larger than it actually is, then people start to freak out a little bit more, which then feeds more coverage, which results in people freaking out a little bit more.
  • He talks about these availability entrepreneurs, these people in these news agencies that make a living out of doing this.

Episode 4 – Intuition and Rationality > Linda the feminist bank teller

  • Now Danny Kahneman already introduced us to this character called Linda the bank teller, and Linda is described as very outgoing and bright.
  • Now when you ask people, “Is Linda a bank teller or a feminist bank teller,” people are way more likely to report that Linda is a feminist bank teller, even though just thinking about the base rates and the probability, there are way more bank tellers than there are feminist bank tellers.
  • So what the Linda example sets up is a conflict between probability and base rates on one hand (what is actually true) and representativeness on the other.
  • The description of Linda is so representative of a feminist that it pushes the probability considerations aside, and we’re more likely to respond that Linda is a feminist bank teller.
  • When I walk into a store, I can tell who works there 99 percent of the time.
  • So Danny Kahneman and Amos Tversky created this Linda problem so that she really fit the mold of a feminist bank teller, in a sense to trick people into making this mistake.
  • What we’re going to do now is present another example, one from, again, Danny Kahneman and Amos Tversky, that they came up with, where we talk about Rudy who’s in a similar sort of vein as Linda the famous feminist bank teller.
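The base-rate point about Linda is just the conjunction rule of probability: a conjunction can never be more probable than either of its conjuncts. A minimal sketch, with purely illustrative numbers (assumptions, not data from the study):

```python
# Conjunction rule: P(A and B) <= P(A) for any events A and B.
# The probabilities below are made up for illustration only.
p_bank_teller = 0.02             # P(Linda is a bank teller)
p_feminist_given_teller = 0.10   # P(feminist | bank teller)

# P(feminist AND bank teller) = P(bank teller) * P(feminist | bank teller)
p_feminist_bank_teller = p_bank_teller * p_feminist_given_teller

# However representative the description, the conjunction cannot
# be more probable than "bank teller" alone.
assert p_feminist_bank_teller <= p_bank_teller
```

No matter what values you plug in for the two probabilities, the assertion holds, which is exactly why the majority "feminist bank teller" response counts as a fallacy.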

Episode 4 – Intuition and Rationality > Becoming an expert

  • Now let’s go back to this distinction between system one and system two.
  • Now earlier on, you gave us a few examples that come effortlessly to us: “Two plus two equals” and the answer appears, and recognizing an angry face just sort of happens.
  • There are groups of people that we call experts who can do some pretty amazing feats effortlessly, tasks that you and I might find difficult as novices.
  • Chess experts can identify and remember thousands of chess positions, and they can quickly decide what the next move is, far better than novices.
  • Radiologists, medical examiners, can put two mammograms, two breast scans, side by side on the screen, and in just a blink of an eye, can tell you whether a person has cancer or not.
  • Again, if you put two fingerprints side by side on the screen to a fingerprint expert, they can, in a blink of an eye, tell you whether those prints came from the same person or two different people.
  • When you’re talking about chess experts or diagnosticians, tasks that were once system two have become system one.
  • Now the same thing happens when you’re learning how to drive for the first time.
  • If I were to measure the pupil dilation of a novice driver-imagine all of the things they have to pay attention to when learning how to drive, particularly with a manual transmission: the friction point of the clutch, where the pedestrians are, where your indicators are, and everything else.
  • If I were to measure the pupil size of the novice driver, they’d be like saucers, wouldn’t they? But as you learn how to drive, these things get easier and easier as you accumulate hours at the wheel.
  • If you were to measure the size of the pupils of the expert driver, they’d be just tiny by comparison to the novice driver.
  • I think that’s a really nice example of, again, the difference between system one and system two and how that develops as you develop expertise.

Episode 4 – Intuition and Rationality > Conversation with Daniel Kahneman (Part 2)

  • I mean, there’s nothing magical about 10,000, and I’m sure that it doesn’t take the same amount of time for different people, and expertise is not wholly defined and so on, but it gives you an idea that this is a lot of hours, that to become an expert where you see that qualitative change in the way things are done, where basically performance switches from what I call system two to system one, that takes a long time.
  • One of the goals of the course is to cue people to the difference between people who are actual experts and people who simply just claim to be experts.
  • Is there anything that people should watch out for, any red flags to tell the difference between people who actually know or can actually do what they claim for themselves? Yes, I mean, I think-Gary Klein and I wrote a paper in which we actually suggested, and it’s embarrassingly simple, that when somebody acts like a self-confident expert on a range of problems, there’s one question to be asked: did that person have a decent opportunity to learn how to perform the task? That requires getting rapid and unequivocal feedback on the quality of performance.
  • So if somebody wanted to become an expert at a new task, what’s the fastest and most efficient way to turn, as you said, that system two, that effortful sort of processing, into system one? Well, there are really two ways of doing this, and you have to use both.
  • For somebody to become an expert driver, you have to tell them how to drive.
  • I would say for somebody to become an expert diagnostician on the basis of X-rays, you have to teach them what the things look like so that they’ll be able to recognize them.
  • Merely telling people how to do something is not going to turn them into experts, and repeatedly telling them the same thing is not going to help.
  • I mean, do not expect that you can generally increase the quality of your thinking because I think you really cannot, but if there are repetitive mistakes that you are prone to make, if you learn the cues, the situations in which you make that mistake, then maybe you can learn to eliminate them.
  • I mean, people feel great when they hear of all these ways of doing things and of controlling themselves, but then when they are making a mistake they are so busy making it that they have no time to correct it.
  • One piece of advice, by the way, is that recognize situations where you can’t do it alone, where you need a friend, where you need advice because if you do it alone you are going to make a mistake.

Episode 4 – Intuition and Rationality > In a nutshell

  • I think it’s important that we have been spending a little bit of time busting myths about how we intuitively think that the mind works, but we need to replace it with a more realistic version of how the mind might actually be working.
  • Here we’ve introduced the two systems: system one and system two.
  • I think this idea of system one and two is a really useful metaphor, as Danny Kahneman points out.
  • If you were to look in the brain for system one and system two, you wouldn’t find them.
  • I mean, it doesn’t really make sense to talk about the two systems as though you can find them neurologically, as though they’re a thing.
  • I think it’s useful to think about system one and two in terms of these characters because it reframes the way we think about dealing with these kind of issues.
  • We also talked about it with respect to expertise and the development of expertise: turning system two, the deliberate process, into system one eventually.

Episode 4 – Intuition and Rationality > Uncut conversation with Daniel Kahneman

  • When people are engaged in a task-you assign a multiplication task-the pupil dilates, and it stays steady as a rock-hippus is gone-so the measurement noise is eliminated.
  • Measurement noise is eliminated when people are engaged in a task, so it is more sensitive than the other autonomic indices.
  • The title of your book is “Thinking, Fast and Slow,” and you talk about two systems, system one and two.
  • That distinction is obvious, and people recognize it.
  • When you describe system one and system two and there are agents that do things, people find it easy to understand, compelling and interesting, and system one and system two develop personalities.
  • You mentioned the difference between system one as things that happen to you and system two as things that you do.
  • System two is-well, there are really two types of operations that system two performs, and one is complex computations.
  • When you are indeed choosing your words carefully because you don’t want to offend, those are situations in which system two is hard at work, and you feel it, so it corresponds… System one and System two really correspond to experiences that are readily available and that everybody recognizes.
  • That distinction between something happening and something that you do is, I think, pretty compelling to most people.
  • We were interested in intuitive statistics, so in estimates that come to people’s mind about probabilities and so on.
  • So in our very first paper, we distinguished intuition from computation, and our point was that intuition is in some cases surprisingly error-prone and that people should rely on computation.
  • Then our work became controversial, and people attacked it and criticized it.
  • There was something that essentially all of the criticisms, all the experimental criticisms of our work, had in common: they created a situation in which people could figure out the answer by working on it.
  • We pointed out that in his experiments, typically people would see-well, how would I describe it? One of our best-known examples in heuristics, and it’s one of the best examples in the heuristics literature, is the Linda example.
  • Then we asked people how likely it is that Linda now is a bank teller, or how likely it is that she is now a bank teller and is active in the feminist movement.
  • Now there’s no question that when you ask different people those two questions, they will invariably say that it’s more likely that she’s a feminist bank teller rather than a bank teller.
  • By priming logical reasoning, you can sensitize people so that they will detect that obviously she is more likely to be a bank teller than a feminist bank teller, but that seems to be a different process.
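The logic behind the Linda example is the conjunction rule: for any two events, the probability that both occur can never exceed the probability of either one alone. A minimal sketch of that check, using made-up probabilities purely for illustration:

```python
# Conjunction rule: for any events A and B, P(A and B) <= P(A).
# The probabilities below are invented for illustration only.
p_bank_teller = 0.05           # P(Linda is a bank teller)
p_feminist_given_teller = 0.8  # P(feminist | bank teller), assumed high given her description

# P(bank teller AND feminist) = P(bank teller) * P(feminist | bank teller)
p_both = p_bank_teller * p_feminist_given_teller

# However plausible the description makes "feminist," the conjunction
# can never be more probable than the single event on its own.
assert p_both <= p_bank_teller
print(round(p_both, 2), p_bank_teller)
```

However strongly the description fits "feminist," multiplying by a conditional probability (which is at most 1) can only shrink the number, which is why judging the conjunction as more likely is a logical error.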
  • When people see only one example, they evaluate the fit of that example.
  • I mean, there’s nothing magical about 10,000, and I’m sure it doesn’t take the same amount of time for different people, and expertise is not well defined, and so on. But it gives you an idea that this is a lot of hours: to become an expert, where you see that qualitative change in the way things are done, where basically performance switches from what I call system two to system one, takes a long time.
  • One of the goals of the course is to cue people to the difference between people who are actual experts and people who merely claim to be experts.
  • Is there anything people should watch out for, any red flags to tell the difference between people who can actually do what they claim and people who cannot? Yes, I mean, I think-Gary Klein and I wrote a paper in which we suggested-it’s embarrassingly simple-that when somebody acts like a self-confident expert on a range of problems, there’s one question to be asked: did that person have a decent opportunity to learn how to perform the task? That requires getting feedback on the quality of performance, and getting it rapidly and unequivocally.
  • So if somebody wanted to become an expert at a new task, what’s the fastest and most efficient way to turn, as you said, that system two, that effortful sort of processing, into system one? Well, there are really two ways of doing this, and you have to use both.
  • Merely telling people how to do something is not going to turn them into experts, and repeatedly telling them the same thing is not going to help.
  • I mean, people feel great when they hear of all these ways of doing things and of controlling themselves, but then when they are making a mistake they are so busy making it that they have no time to correct it.
  • Now that’s hard, and obviously, as you mentioned, trying to get people to be motivated enough to engage system two-well, actually, a lot of people have the tools and everything they need to make better decisions and to learn a new task; it’s just a matter of putting in the cognitive effort, a little elbow grease, and actually making that happen.
  • He finds that some people are intelligent but not particularly rational, or vice versa.
  • So you’ve got to create many opportunities for people to bump into each other so that they can exchange ideas.
  • You’ve got to allow-to encourage exchange of ideas between people who are not in the same field.
  • You know, Steve Jobs was famous for the suggestion of having very few restrooms in the building, to force people from different units to meet each other on their way there.
  • I think people have no idea what the future will be, and I’m no exception, so I have really no interesting forecast.
  • Do you think that’s a fruitful enterprise, the merger of the two? I’ve always believed that there are some people who are by nature skeptics and other people who are by nature believers, even gullible.
  • We’re presenting students with the cognitive reflection task-which should be interesting with 200,000 people taking the course-to see what the difference is between fast and slow thinking.
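The best-known cognitive reflection task item is Shane Frederick’s bat-and-ball problem: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball. The fast, intuitive answer ($0.10) fails the stated constraint; a quick sketch of the algebra system two has to do:

```python
# Bat-and-ball problem (cognitive reflection task):
# bat + ball = 1.10, and bat = ball + 1.00.
total = 1.10
difference = 1.00

# Substitute: ball + (ball + difference) = total  ->  ball = (total - difference) / 2
ball = (total - difference) / 2
bat = ball + difference
print(round(ball, 2), round(bat, 2))  # the correct answer: 0.05 and 1.05

# The intuitive answer of ten cents does not satisfy the constraint:
intuitive_ball = 0.10
print(abs((intuitive_ball + (intuitive_ball + difference)) - total) < 1e-9)  # False
```

The point of the item is exactly the fast/slow distinction discussed above: system one supplies $0.10 immediately, and only an effortful system-two check reveals that it sums to $1.20, not $1.10.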
  • People who understand anchoring, who understand availability-another phrase, “What you see is all there is,” has limited currency, but it has some currency.
  • It’s really introducing terms that make it easier for people to see certain phenomena.
  • You’ve mentioned, for example, Keith Stanovich’s conception of the cognitive reflection tasks, so we could potentially see a change-obviously not on exactly the same questions-between the beginning and the end of the course: maybe a drop in belief in the paranormal, maybe an increase on need-for-cognition measures, so that people want to think more by the end.
  • Can you think of another benchmark that might help to gauge whether people are doing-are thinking more? This is very ambitious, what you’re trying to do.
  • So, ideally, you’d want people to do an exercise: “Here is a mistake I made today,” or, “Here is a mistake I almost made today”-you’d want to make people introspect.
  • The far easier task is to make people critical of other people.
  • If you improve-my thought has always been, you know, I’ve said that the aim of the book is to educate gossip, really, because I believe that if you train people to be good critics of other people’s thinking and decision-making, eventually they will turn that on themselves.
  • This is the easiest way of doing it, rather than making people do something that is inherently quite aversive, which is monitor themselves and criticize themselves as they go along.