The following content is provided by MIT OpenCourseWare under a Creative Commons license. Additional information about our license, and MIT OpenCourseWare in general, is available at ocw.mit.edu.

PROFESSOR: Good afternoon. Before I forget, today's topic is cognition. And in part, a good chunk of this lecture is going to be based on work done by Kahneman and Tversky, who won the Nobel Prize for a bunch of this work some years back. A couple of years back, now. The reason that is of particular interest is that Danny Kahneman is here tomorrow speaking. E-51, 4 o'clock, right? So if you happen to be (a) interested in the material and (b) at leisure at 4 o'clock tomorrow, I'd go hear him. He's a very good speaker, and there aren't that many folks who I get to talk about in psychology who win Nobel Prizes. So, you should go hear him.

That's one characteristic of today's lecture. I realize that the other characteristic of today's lecture is that I get to tell you a bunch of things that are applicable to the forthcoming election. And to politics in general. This particular swath I'm cutting through, the large area of cognition, includes a bunch of topics that, if you've watched the debates thus far, or if you go and watch the debate that's Friday, right, the next one's Friday evening, you'll be able to see some of the stuff I'm talking about today in action.

There are two large parts to what I want to talk about. And this is described, at least in rough detail, in this abstract at the top of the handout. Using the computer as a model for how the brain works is sort of a common game, rather like using the camera as a model for the eye. There are important differences. One of the important differences that I'll talk about, that I started talking about last time, is that if you call up a document off your laptop, you are expecting that thing pulled out of memory to be exactly what you put into memory.
As we saw towards the end of the last lecture, that's not true about human memory. I'll elaborate on that a bit. The other thing you expect is that your computer will do mathematical and logical operations correctly and quickly. And that also turns out to be not particularly characteristic of human cognitive behavior, for interesting reasons that I will describe in the second part of the lecture.

So let's start with this issue, this point, that says that recall is not just taking stuff out of storage. Last time, I talked about a couple of examples of that. And those are the ones labeled as old on the handout. There's the context of remembering: if you are depressed, your memories of childhood are more gloomy than if you are in a good mood. There is retriever's bias; that was the example I was giving last time, with these experiments where, in a simulated altercation that you see, the knife might move from one person's hand to another person's hand in your memory, depending on the particular biases and associations in your memory.

Now, let me tell you about a few more ways in which memory is not just like pulling a document off the hard drive somewhere. One of these is what I've labeled on the handout as failures of source monitoring. Source monitoring is what we're busy making sure you do correctly on your papers: telling us where the information comes from, and evaluating the information on the basis of where it comes from. It turns out that that's not as easy to do as one might think it ought to be. Indeed, I've got this quote from Euripides on the handout that says man's most valuable trait is a judicious sense of what not to believe.

If you hear something that's clearly false, is it possible for you to not be influenced by that? This turns out to have been a question that philosophers have worried about. Descartes thought that the answer to the question was yes: if you hear a statement that's clearly false, you can label it as clearly false and, in a sense, put it aside.
Spinoza, in contrast, thought that it would be very nice if that were true, but he doubted that it really was. This has subsequently been tested experimentally. And I don't have enough colored chalk around, but I'll simulate it. Here's what Dan Gilbert and his colleagues did a few years back. They did an experiment where you read little biographies of people. And the material you're reading is color-coded. And so you're reading along. And then some of it, here, we'll use straight lines for a different color. Some of it is red. If it's red, it's a lie. Don't believe it. But if it's green, it's true. And if it's red it's a lie, and if it's green it's true, and so on.

Read a paragraph of this stuff. We're going to make sure that you've read it, because we're going to test you afterwards and make sure that you actually know what it said. Because it's very boring to discover that you can ignore lies if you don't ever hear the lies. But we know you've heard these various lies. And then what we'll do is ask you about this person. The metric they were using was a scale between college student and criminal, I believe. And you were asked how criminal you thought these particular people were.

So you're reading along. And so-and-so is taking 18.03. And steals calculators. And does the problem sets reasonably reliably. And then fences the calculators. And so on, right? The result is that even though you know that the statements are false, your opinion of this hypothetical person is lowered by negative lies. And, reciprocally, raised by positive lies. You can't completely discount the information that you're hearing, even if you know it's a lie.

Now, if it is not immediately obvious to you that this has direct relevance to the current political campaign, you can ask yourself about the behavior you can see in this campaign, or in any other campaign that you ever care to subject yourself to.
Which is, you get charges that are made, typically by surrogates of the candidate. So a clear example here would be the charges about Kerry's Vietnam behavior, brought by the Swift Boat people, not directly by the Bush campaign. And I should be able to think of a flip side. Oh, the National Guard stuff for Bush. Regardless of the truth, we're not going to get into the truth assertion of this. But in both cases the really nasty attacks are raised by surrogates.

If it is then proved to be a lie, then the candidate can have the great and noble opportunity to run around the country saying, look, people should not be saying that John Kerry molests small animals. It's a lie that he molests small animals. Anybody who says that should be given a dope slap. And I would never say anything like, John Kerry molests small animals. And I will tell all of my people not to say that John Kerry molests small animals. And you can go home with this righteous feeling around you, saying that I've done the right thing of condemning this smear. Not only that, I've spread it around a bit, and the dumb suckers are going to believe it. It's not going to swing the election one way or the other, but it's going to push things.

And political campaigns do this absolutely routinely. Perhaps not out of a completely Machiavellian understanding of the psychology, but because they know it works in some fashion. I don't know that they're sitting around reading Dan Gilbert's papers to figure this sort of thing out.

This has applications elsewhere in your life, too. It's very important to understand -- I'm going to take zinc because it's going to keep me from getting colds. Now, why do I think that? Is that because my aunt June, who's the medical wacko of the family, thinks that's true? Or is that because I read it in the Journal of the American Medical Association? It makes a difference where the information comes from. So, some sort of an ability to evaluate the source is important.
And the fact that we're not very good at it is problematic.

A particular subset of the problems with source monitoring is what's known as cryptomnesia. And that's the problem where you don't remember where your own ideas come from. The canonical form of this, in a setting like MIT -- actually, this is the part of the lecture for the TAs. The canonical version of this is you go to your doctoral adviser with a bright idea. Yeah, I got this cool idea for a new experiment. Your doctoral adviser looks at you, says, that's stupid. And you go away. You know, you're kind of devastated. Next week, in lab meeting, your doctoral adviser has this really bright new idea. It's your idea. He thinks it's his idea. He's not -- probably not -- being a slimeball saying, I'm going to steal their ideas. It's that you actually forget where it comes from. And in the absence of a clear memory of where the idea came from, if I've got an idea in my head about my research, gee, I wonder where that came from? It came from -- I think they come from my shower head. Because all the good ideas I have always come in the shower. But most people tend to think it comes from them. I just cite the shower.

But this is actually a very good argument, by the way, for taking copious notes in a lab book when you work in a lab. Because if it ever gets to the point of an argument, you want to be able to say, look, in this dated entry from October 5th, 10,000 and -- 2004, it might take you a long time to get your doctorate -- here's the idea. I wrote down this idea, and then I wrote down this tear-stained comment where you told me I was a doofus. But, anyway, I really do think this was my idea. And I ought to be an author on the paper that you just submitted about it. Anyway, it's a serious problem.
Another place where that shows up is that there are recurring scandals in the sort of academic book-publishing world, where some well-respected author is proven to have taken a chunk out of somebody else's book, or something very close to a chunk out of somebody else's book, and published it. Some people are doing that because they're sloppy, slimy people. Other people are doing it because you read a lot of stuff. Maybe you took good notes, maybe you didn't. And then when you start writing, this clever phrase comes to you -- or maybe just a pedestrian phrase comes to mind. But a phrase comes to mind. And you simply have forgotten the source of that phrase. Which is that you got it from somebody else's writing, and then -- it's a strong argument for making sure that you keep very clear track of where you're getting your material from. And it's part of why we beat so hard on the notion of referencing in papers in a course like this. That's what cryptomnesia is, in any case.

So, those are failures of source monitoring. It is also possible, to move on to the next one, to have memories that feel like memories but just are not true. Again, Liz Loftus has done some of the most interesting research on this. One of her experiments, another one of these car crash experiments -- remember the experiment of, how fast did the car smash, or bump, or tap, or whatever it was, into the other car? Well, here's another one. You watch another one of these videos of a car getting into an accident. And part of the problem is that the car went straight through a yield sign. Later, you are given information that suggests that it might have been a stop sign. You can do this in a variety of ways, including ways that tell you that that is bad information. Another source monitoring problem.
But, given later information that it was a stop sign, not a yield sign, people will incorrectly remember it -- not everybody, but you will get a percentage of people -- who will now misremember the original scene as having had a stop sign in it. If you ask what color the sign was, they'll say red, even though a yield sign would be yellow, and was yellow in the actual video that you saw. So you've recreated -- you've now put together the true memory with some false later information. And it's corrupted that true memory.

Now, that's obviously of some interest in terms of getting testimony in court and things like that. The place where this has become extremely controversial, and a real problem, is the issue: is it possible to recover memories that were suppressed? Suppose something really bad happens to you when you're a kid. Is it possible to suppress that memory and not have it available to you until some later point, at which point you now remember this memory? Is that possible? The evidence suggests that it may indeed be possible to not think about something for a very long time and then have it bubble up into your memory. Sadly, the evidence is also strong that it is possible, under the right circumstances, for stuff to bubble up into your memory that isn't true.

If what's bubbling up into memory is memories of, for instance, childhood sexual abuse, for which there were a large number of criminal cases about a decade ago now, it's a really important question to know whether these are real memories. And it is deeply unclear that there's any nice, neat way to know the answer. Now, look, one of the reasons it's very unclear is you can't do the real experiment. You can't take a little kid, abuse the little kid, and say, let's come back in 20 years and see if they recover the memory. I mean, it doesn't take a genius to figure out that that's not going to work. But Liz Loftus has a very nice experiment that points in the direction of this sort of problem.
Here's what she did: you go and interview families about, let's take one specific example, getting lost in a mall. Did so-and-so, the designated subject -- you are --

AUDIENCE: Olga.

PROFESSOR: Olga. Did you ever get lost in the mall?

AUDIENCE: [UNINTELLIGIBLE]

PROFESSOR: OK, good. We won't ask her that. But we'll ask her family whether Olga ever got lost in the mall. And they'll say, no, no, no, it never happened. OK, good. Got a brother or sister?

AUDIENCE: Yeah.

PROFESSOR: Good. What have you got?

AUDIENCE: Brother.

PROFESSOR: Brother. So now we call her brother aside here and say, we're going to do a little experiment here. Go ask Olga if she remembers the time that she got lost in the mall. And you do this with a lot of people. Olga might look at her brother and say, no. And that's the end of that particular experiment. But some percentage of people, given a brother who seems to think that this happened at some point, will say, yeah, I remember that. Well, tell me about it. Well, you know how this story goes. Even if you were never lost in the mall, we could ask Olga; she could generate the story pretty successfully. It'll be something like, well, you know, I was there with my parents, right? And I was looking at these toys, like, in the window. And then I followed these -- I was little, I was really little. You don't get lost in the mall when you're 18 or something. So, I was little. I'm following these legs around. And it turns out to be the wrong legs. I thought it was my mom and it was somebody else. And I started to cry, and then this big man came and he took me to the security guys. And they announced it over the PA system, that I was lost. And my mother came back -- people fill in really good, detailed stories of this.
Stories that are apparently invented out of -- well, not really out of whole cloth, I suppose. They're invented out of the understanding that you have of the possibility of being lost in the mall, and the childhood fear of being lost in the mall. But it's not a real memory. It feels like a real memory. Subjects in Loftus's experiments, confronted later with the reality that this was all a setup, were often quite surprised -- are you sure I was never lost in the mall? No, says Olga's brother, Liz Loftus just told me to tell you that. And, oh, ah, hm, that's a little weird. But the point is, it is possible to generate things that feel like memories that are not memories. Yes?

AUDIENCE: What about remembering dreams? Supposing something happened in a dream that could happen, and you remember that as a [INAUDIBLE]

PROFESSOR: If you get routinely confused about what happens in your dreams and what happens in reality, that's known in the trade as a failure of reality testing. And it is not considered a good psychiatric sign. But, sure, those sorts of things are possible, particularly when you end up having desperately realistic kinds of dreams, the sorts of things where these sort of reality-testing issues do come up. How many people -- I'll probably ask this again when we talk about sleep and dreams, but never mind -- how many people have had the dream of their alarm clock going off? Dreamed that they turned off their alarm clock and got up, only to then discover, oh, man, I'm in bed. And I'm supposed to be in my psych recitation or something like that. Or, at least, tried to give that as a story to their TA. So, yeah, there are possible sources of confusion like that.

The last item on this list of sources of possible confusion, I think, says something like becoming plastic again. This is new stuff which I will come to momentarily.

AUDIENCE: [UNINTELLIGIBLE]

PROFESSOR: That's OK.
AUDIENCE: Is there a way to distinguish between these false memories or if they've actually occurred? [UNINTELLIGIBLE] If you don't know them [UNINTELLIGIBLE] comes and says, I think I was sexually abused as a child, is there a way to determine, is this, like --

PROFESSOR: There is, at the present time, no ironclad way to know that. I mean, sometimes you can go and get evidence -- there's no ironclad way to know simply from the memory testimony. There may be ways to find out by talking to other witnesses or something like that. There have been various efforts to look, for instance, at brain activity and see whether or not there's a difference in the pattern. So you do this in the lab: create a false memory and a real memory, and see if you can see the difference somehow in an fMRI scan or something of that sort. There are hints that maybe you can get there.

It's like lie detectors -- a fascinating topic for a whole lecture that I'm not giving. There is no ironclad way of telling a lie from the truth by wiring anybody up, regardless of what the FBI tells you. Actually, I shouldn't be telling you this, because the only reason that these things work is because people think they work. If you think that the lie detector's really going to work, then if you're lying, you tend to get kind of more nervous than if you're telling the truth. And that's what they can monitor. But a really good pathological liar, or a well-trained agent, is apparently pretty good at getting past your average lie detector. Lots of people -- you can get serious defense money by having a brilliant new idea about how to tell truth from lie, and real memory from false memory. There's a lot of interest in doing that. But there's no ironclad way of doing it at present.
It makes for great sci-fi novels, though, right? What's going to happen when we finally figure out that we can actually stick this thing on your head and tell whether your memories are true or not? But at the moment it's still sci-fi.

Not quite sci-fi, but still very new, is the following indication. Remember, I talked about consolidation last time -- this process of making a fragile memory into something that will last for years and years. There is evidence, at least from rats in a fairly restricted range of studies, that when you recall a memory, when you bring it back out of, say, long-term memory into that working memory, it becomes plastic again. And perhaps some of the distortions in memory can arise at that point, because the memory is again plastic. It needs to be reconsolidated in order to be, in effect, restored. So you bring it out, and it's now vulnerable again in ways that it wasn't before you remembered it. My guess is that if you come back to the course ten years from now, there'll be much more to be said about that. That's new and interesting.

But the general point should be clear enough. Which is that recall out of memory, retrieval out of memory, is an active reconstruction. It has multiple pitfalls in it that might cause what you are pulling out as a memory to be distorted in some fashion or other.

Now, the second part, the larger part, of what I want to talk about today is the sense in which we are not simple logic engines in the way that our computers, most of the time, are. Now, one way to think about that is to think about a distinction between what's called narrative thought and propositional thought. And this runs parallel: narrative thought and episodic memory, propositional thought and semantic memory. They travel together. So, propositional thought is the sort of thought that your computer would be good at.
Moving symbols around, is x bigger than y -- problem set kind of thinking is mostly propositional thought. Narrative thought is thought that has a story-like aspect to it, perhaps an episodic memory type aspect to it. So, an example of narrative thought might be: do you want to go to the party that is happening on Saturday night at Alpha Beta Gamma house, or something like that? How do you figure that out? Well, you probably don't do some symbolic cost-benefit analysis. You probably imagine yourself there. Would I be having fun? Who's going to be there? You'd run a scenario. And that's the sort of narrative thought.

It turns out that narrative thought has a power in our own decision-making that is capable of running across, running over, the simple facts of the situation, even when we know the facts correctly. Risk assessment is one example of this. So, forced choice. You've got to pick one or the other. Don't just sit on your hands here. Which makes you more nervous: taking a ride in the car or flying in an airplane? How many vote for riding in the car? How many vote for flying in an airplane? As usual, half the people can't figure out that forced choice, pick one or the other, means you have to raise your hand for one of these two options, at least if you're moderately alert. But anyway, plane wins big-time.

How many people -- let's just ask the propositional thought, or the semantic memory nugget. Which is more dangerous? Flying in a car, that's only -- that's the Harry Potter version. Flying a plane or driving a car. How many vote for plane? How many vote for car? Everybody knows the proposition here. It's riskier to -- you can measure this in all sorts of different ways, per mile, per time -- it's riskier traveling in a car than traveling in a plane. But people are much more nervous about planes. Why is that?
There are multiple reasons. But perhaps one of the driving ones is that the narrative surrounding planes, and particularly surrounding plane crashes, is sufficiently vivid that it overrides your knowledge of the mere facts of the matter. If and when a plane sadly goes down, that's big news. A lot of people tend to get killed. It makes big news in the papers and on TV. Highly salient. The fact that there's a steady drumbeat of car crashes, unless it happens to be very local to you, doesn't have any particular impact on you. And the raw statistic doesn't much matter.

In the same way, this turns out to be true in course catalog land. The MIT course evaluation guide, when it comes out, has vast piles of cool statistics -- what people thought of this particular course. And then it tends -- I haven't seen the most recent one, but typically it has a paragraph of comments. One of my favorites over the years: so-and-so is to the teaching of physics as Einstein is to the National Football League. That's a very salient bit of narrative. And it turns out that those sorts of comments have much more impact than all the statistics, in spite of the fact that the nice rational you knows that one person's brilliant catty comment, or brilliantly positive comment, is presumably much less valuable than the statistical average across 300 students. But the narrative, again, overwhelms.

You can watch this on Friday, when you watch the debate. You can be guaranteed that somebody is going to -- it'll probably be either Kerry or Bush, actually -- that Kerry or Bush will take some boring statistics and personalize it. They do this all the time. It's become a sort of a trope, a standard item in the State of the Union message, where the President always puts a couple of heroes, they probably are, but, you know, heroes up in the balcony there, so that he can point to them and say, look at this guy. Look at that guy.
And in the debate, there'll be some discussion of employment statistics. And presumably this would be Kerry, because he'll be beating on Bush about it. I don't care about what you say about the economy recovering or whatever. What I care about is poor little Olga. She got lost in the mall, and then her father lost his job as a mall security guy because of the decisions that your government, your administration, made. And what am I going to say to poor little Olga?

Why? It's very sad about poor little Olga, who's now been lost in the mall and she's never going to sit in the front again. But there are, like, 260 million people, whatever it is, in the country at the moment. It's very sad about poor little whoever it is. What's really important are the broad economic statistics. Broad economic statistics are wonderful propositional-reasoning kinds of stuff, but are really boring, unless you're an economist. But poor little Olga is really riveting. And poor little Olga is what you'll hear about in the debates. We should do a debriefing about this on Monday. If anybody listens to the debate and hears any of these little tidbits, send me email, in case I miss it. And then we can talk about this.

Now, this can get studied in the lab. Here, take a look. On the handout, there's a thing called the framing demo. It says that there's an unusual flu coming. It's expected to kill 600 people in the U.S., and there are two things you can do. There's Program A and Program B. Take a quick look, decide on Program A or Program B, and then we'll collect a little data and talk about it here. Don't cheat off your neighbor's test.

OK. Everybody manage to come up with an answer here? OK. So, let's see for starters. How many people went for A? How many people went for B? How many people don't know what they're going for? OK, well, it looks about evenly divided.
Boy, that's really boring. Well, it would be really boring, except that there are two versions of the handout out there. So, Version One of the handout says that if Program A is adopted -- uh-oh, I don't know which Version One says. Well, it doesn't matter which I call Version One, does it? Version One says 400 people will die. Version Two says 200 people will be saved. Did I get the language right there? What you will notice is, if the net is 600 here, these are identical, right? There's no difference except in the way that it's phrased, or the way the question is framed.

There's a similar change to the second option, B. In the "die" version, it's described as a 1/3 chance that no one will die and a 2/3 chance that 600 people will die. Do you know about the notion of expected value? You can calculate the expected value of that option, which will be a 1/3 chance that no one will die and a 2/3 chance that 600 people will die. And then that's going to be 400 people dying, again, here. So all the options are mathematically identical.

Now, how many people have Version One, the 400 people will die thing? OK, how many people have the 200 people will be saved thing? OK, good. So, it's roughly evenly divided. Now, what I want to do is get the A and B for these two different versions. If you have the 400 people will die version, how many people opted for Option A? How many people opted for Option B? OK, so B beats A, and actually it beat A big time here. If you have the 200 people will be saved version, how many people went for A? How many people went for B? OK, in that case A beats B, not quite as massively. But clearly this manipulation has flipped the answer. It ain't math that's doing it, because the math is the same.

What's changing here is what Kahneman and Tversky called the framing of the problem. But really it's the narrative around the problem. What happens is, you read Option A: 400 people will die. That's terrible.
Well, the second possibility in this case, the B, is: whoa, there's something we could do that probabilistically might save some of those people. That sounds OK. I'll opt for this. As opposed to this one: 200 people will be saved. Well, that's better than nothing. And the alternative, B, here -- you read the part that says everybody might die. Hm, doesn't sound too good. And so you flip it. So what's happening here is that the way you ask the question changes the answer you get.

Well, we can go back to the politics game here just fine. Pollsters do this all the time. The neutral pollsters desperately try not to do it. But campaign polls, when they're trying to get the correct answer out: who would you rather have for President, a strong leader who has ruled the free world with a firm hand for four years, or a flip-flopping senator from a liberal East Coast state? Or you could ask the question: who would you like to have as your President for the next four years, a guy who barely made it out of Yale and has driven the country into a ditch, or a guy who actually got decent grades at Yale and is a national war hero and has really good hair?

You can move the question around, by the way, not just in polling, but in the way you frame issues. I mean, talk about framing issues. If you frame the war in Iraq as part of the War on Terror, an integral part of the War on Terror, asking whether you want to keep doing that is different than taking the same thing and calling it a distraction from the War on Terror, for example. The facts remain sort of similar. It's harder to do the experiment in a nice, clean, controlled way -- here, we can make all the math come out right. In a political debate, there's an argument about the actual facts, too, of course. But the way that you present the facts, the way you frame the facts, is designed to get the answer that you want out of those facts.
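To make the arithmetic behind the flu demo explicit, here is a worked expected-value calculation, a minimal sketch assuming only the figures quoted in the lecture (600 people at risk; the probabilistic option gives a 1/3 chance that no one dies and a 2/3 chance that all 600 die):

E[\mathrm{deaths}] = \tfrac{1}{3}\times 0 + \tfrac{2}{3}\times 600 = 400,
\qquad
E[\mathrm{saved}] = \tfrac{1}{3}\times 600 + \tfrac{2}{3}\times 0 = 200.

So the gamble has the same expected outcome as the sure option in either version of the handout, 400 dead and 200 saved out of 600; only the wording, the framing, differs.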
There's a beautiful round of this in today's paper, because there's the report of the, whatever they're called, Iraq Commission that was looking for weapons and didn't find any weapons. And you can read the Kerry camp statements about what that means, and what the Bush camp says it means. The Bush camp is very big on intention this morning. The report says that Saddam really wanted weapons. And so the Bush camp is busy saying, he wanted weapons and he would have gotten them as soon as he could. And the Kerry camp is looking at the same facts and saying, we all want stuff. Man, he didn't have anything. So. Anyway, you can have lots of fun Friday. You can watch them do this one, too.

All right, so: the narrative that you put around a story. The narrative that you put around the story does not influence your computer. You can tell your computer about 1/3 times 600 plus 2/3 times minus 600 any way you like. You can use big font, you can use little font, you can use whatever you want; it's going to come out with the same answer. That's not true of the way that you think about these things. You're going to get different answers depending on how you frame the question.

Let me do a different example. That's not my handout; I've got to see what's actually on the handout here. So, right underneath that, underneath the bit about framing, there's another demo. I will, in advance, promise you that this is the same on all the handouts. Because otherwise you're going to sit there trying to figure out what's the weird thing. So, this is the same on all the handouts. I'm also aware that I have six slots between most probable and least probable, and only five options to go in there. Deal with it.

Here's what you've got to do. You've got this guy Henry. Henry is a short, slim man. He likes to read poetry. He's been active in environmental and feminist causes. And Henry is a what?
779 00:41:41,230 --> 00:41:44,920 And so there are these five options here. 780 00:41:44,920 --> 00:41:48,280 What you want to do is rank order them from most likely to 781 00:41:48,280 --> 00:41:49,530 least likely. 782 00:41:53,880 --> 00:41:56,150 And then we'll talk about that a bit. 783 00:41:56,150 --> 00:42:00,110 Let's see how that -- 784 00:42:03,140 --> 00:42:07,220 No, you can't copy off your neighbor. 785 00:42:07,220 --> 00:42:08,470 You know Henry. 786 00:42:16,140 --> 00:42:17,850 I should have brought Henry in. 787 00:42:17,850 --> 00:42:21,210 Just grabbed some character and said, this is Henry. 788 00:42:21,210 --> 00:42:21,730 All right. 789 00:42:21,730 --> 00:42:24,080 How many people need more time for this? 790 00:42:24,080 --> 00:42:26,450 OK, a couple of people still working on it. 791 00:42:35,480 --> 00:42:36,210 All right. 792 00:42:36,210 --> 00:42:42,250 Well, we'll just go up. 793 00:42:42,250 --> 00:42:44,410 Let's look at a couple of particular 794 00:42:44,410 --> 00:42:47,440 comparisons on here. 795 00:42:50,370 --> 00:42:56,440 How many people put down, let's look at A and B. Whoops. 796 00:42:56,440 --> 00:43:00,310 That looks like propositional thought. 797 00:43:04,920 --> 00:43:06,470 All gone. 798 00:43:06,470 --> 00:43:09,540 OK, let's look at A and B, for example. 799 00:43:09,540 --> 00:43:14,420 How many people said A was more probable than B? 800 00:43:14,420 --> 00:43:17,550 How many people said B was more probable than A? 801 00:43:17,550 --> 00:43:18,060 All right. 802 00:43:18,060 --> 00:43:19,840 B wins. 803 00:43:19,840 --> 00:43:28,630 And if I've got the right one here, so ivy beats truck 804 00:43:28,630 --> 00:43:31,550 big-time, right? 805 00:43:31,550 --> 00:43:33,180 All right. 806 00:43:33,180 --> 00:43:38,130 How many truck drivers are there in the country? 807 00:43:38,130 --> 00:43:38,620 AUDIENCE: A lot. 808 00:43:38,620 --> 00:43:39,380 PROFESSOR: A lot. 809 00:43:39,380 --> 00:43:41,130 10 to the -- 810 00:43:41,130 --> 00:43:44,170 I don't know the answer to this, but? 811 00:43:44,170 --> 00:43:45,760 5? 812 00:43:45,760 --> 00:43:48,200 4 is only ten thousand. 813 00:43:48,200 --> 00:43:48,960 AUDIENCE: 5. 814 00:43:48,960 --> 00:43:53,260 PROFESSOR: 5 is probably-- it may be 6, but let's go with 5. 815 00:43:53,260 --> 00:43:55,250 OK. 816 00:43:55,250 --> 00:43:58,030 How many Ivy League classics professors are 817 00:43:58,030 --> 00:43:59,280 there in the country? 818 00:44:04,900 --> 00:44:07,120 How many Ivy League schools are there? 819 00:44:07,120 --> 00:44:07,740 AUDIENCE: Eight. 820 00:44:07,740 --> 00:44:09,670 PROFESSOR: Eight to ten, something like that. 821 00:44:09,670 --> 00:44:11,840 Maybe, if you're lucky, there's ten classics 822 00:44:11,840 --> 00:44:13,110 professors at each. 823 00:44:13,110 --> 00:44:16,320 So that's going to get you to 10 to the 2nd. 824 00:44:16,320 --> 00:44:21,630 Is it a thousand times more likely that some random guy 825 00:44:21,630 --> 00:44:26,290 that you meet, who happens to be -- well, let's phrase this 826 00:44:26,290 --> 00:44:26,890 a different way. 827 00:44:26,890 --> 00:44:30,790 If you took all, whatever this is, a hundred thousand truck 828 00:44:30,790 --> 00:44:31,800 drivers in the country. 829 00:44:31,800 --> 00:44:32,760 I'm sure there are more than that. 830 00:44:32,760 --> 00:44:34,490 But let's just suppose there are only a hundred thousand 831 00:44:34,490 --> 00:44:36,710 truck drivers in the country.
832 00:44:36,710 --> 00:44:42,940 Are there fewer than a hundred of them who might be, for 833 00:44:42,940 --> 00:44:46,330 instance, short, poetry-loving feminists? 834 00:44:49,780 --> 00:44:53,090 And that assumes that every one of the Ivy League 835 00:44:53,090 --> 00:44:56,480 professors is a short -- 836 00:44:56,480 --> 00:45:00,120 there are no Ivy League professors who are big, burly 837 00:45:00,120 --> 00:45:02,640 guys with tattoos and stuff. 838 00:45:02,640 --> 00:45:04,870 Or women, for that matter. 839 00:45:04,870 --> 00:45:09,110 Probably half of them are women, so this is already -- 840 00:45:09,110 --> 00:45:11,520 What you've done here is you've ignored what's known as 841 00:45:11,520 --> 00:45:13,130 the base rate. 842 00:45:13,130 --> 00:45:20,000 All else being equal, it's more likely that, if I just 843 00:45:20,000 --> 00:45:24,630 say, here's Henry, is it more likely that Henry is a truck 844 00:45:24,630 --> 00:45:26,620 driver or an Ivy League classics professor? 845 00:45:26,620 --> 00:45:28,950 It's way, way, way more likely that he's a truck driver. 846 00:45:28,950 --> 00:45:31,300 In fact, it's so much more likely that he's a truck 847 00:45:31,300 --> 00:45:34,240 driver that it's probably more likely. 848 00:45:34,240 --> 00:45:35,990 It's almost undoubtedly more likely. 849 00:45:35,990 --> 00:45:40,140 Even for this description of Henry the amazing feminist, 850 00:45:40,140 --> 00:45:42,630 whatever he was. 851 00:45:42,630 --> 00:45:45,740 You see, this is what's known as the base rate fallacy. 852 00:45:45,740 --> 00:45:50,700 People do not tend to take into account the base rate in 853 00:45:50,700 --> 00:45:53,700 the population when they're thinking about this. 854 00:45:53,700 --> 00:45:56,660 I'm asking how likely it is, and you're answering a 855 00:45:56,660 --> 00:45:58,070 different question. 856 00:45:58,070 --> 00:46:04,830 You're answering the question, how typical is Henry of your 857 00:46:04,830 --> 00:46:11,010 image, of your stereotype, of a truck driver versus of an 858 00:46:11,010 --> 00:46:14,010 Ivy League professor? 859 00:46:14,010 --> 00:46:17,490 And realize that even if -- let's suppose your 860 00:46:17,490 --> 00:46:19,370 stereotype is right. 861 00:46:19,370 --> 00:46:23,700 What's the chance that a truck driver could be like Henry? 862 00:46:23,700 --> 00:46:25,610 Maybe it's only 1 in 100. 863 00:46:25,610 --> 00:46:27,690 All right, if it's 1 in 100, there's still 10 864 00:46:27,690 --> 00:46:29,820 to the 3rd of them. 865 00:46:29,820 --> 00:46:32,970 And we're still swamping the population here. 866 00:46:32,970 --> 00:46:38,040 So, failing to take into account the base rate is going 867 00:46:38,040 --> 00:46:40,490 to lead you to the wrong answer there. 868 00:46:40,490 --> 00:46:44,620 Let's look at another piece of these data. 869 00:46:44,620 --> 00:46:46,810 Let's look at -- 870 00:46:52,440 --> 00:47:01,850 Is it more likely, A and E. A, E. How many people said that A 871 00:47:01,850 --> 00:47:05,070 was more likely than E? 872 00:47:05,070 --> 00:47:07,980 How many people said that E was more likely than A? 873 00:47:07,980 --> 00:47:09,820 Yeah, well, this is why Kahneman got the 874 00:47:09,820 --> 00:47:10,940 Nobel Prize, you see. 875 00:47:10,940 --> 00:47:12,090 Because he said that, too. 876 00:47:12,090 --> 00:47:13,060 Well, he didn't say that. 877 00:47:13,060 --> 00:47:14,950 He said, that's what you would say. 878 00:47:14,950 --> 00:47:15,930 Or he found, anyway.
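As a rough check on the base-rate arithmetic above: the 10 to the 5th and 10 to the 2nd are the lecture's deliberately round numbers, and the 1-in-100 hit rate is the illustrative guess just used; this sketch only spells out the same calculation.

    truck_drivers = 10 ** 5     # the lecture's low, round estimate
    classics_profs = 10 ** 2    # ~8-10 schools times ~10 professors each

    # Even if only 1 in 100 truck drivers fits the Henry description,
    # and every single classics professor does:
    fitting_drivers = truck_drivers * (1 / 100)   # 10 to the 3rd of them
    fitting_profs = classics_profs * 1.0

    # Among people who fit the description, truck drivers still dominate.
    print(fitting_drivers / (fitting_drivers + fitting_profs))   # about 0.91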
879 00:47:15,930 --> 00:47:19,560 So you said E is much more likely. 880 00:47:19,560 --> 00:47:24,880 Remember, like, third grade set theory stuff? 881 00:47:24,880 --> 00:47:29,120 Let's make the little picture here. 882 00:47:29,120 --> 00:47:30,180 All right. 883 00:47:30,180 --> 00:47:34,960 The set of all truck drivers, right? 884 00:47:34,960 --> 00:47:36,960 The set of, I can't even remember what E is. 885 00:47:36,960 --> 00:47:43,370 E's like Mensa Ivy League truck drivers or something. 886 00:47:43,370 --> 00:47:51,740 It's some small little set, it's a subset of A. 887 00:47:51,740 --> 00:47:57,340 So, for starters it's a little unlikely that the chance of 888 00:47:57,340 --> 00:48:00,280 being in here could be greater than the 889 00:48:00,280 --> 00:48:04,010 chance of being in here. 890 00:48:04,010 --> 00:48:06,120 The best it's going to be is equal. 891 00:48:06,120 --> 00:48:07,370 In some fashion. 892 00:48:09,760 --> 00:48:10,390 Here's Henry. 893 00:48:10,390 --> 00:48:13,580 I'm meeting some random Henry. 894 00:48:13,580 --> 00:48:16,680 What's the chance that he's in E? 895 00:48:16,680 --> 00:48:20,190 It's just not -- 896 00:48:20,190 --> 00:48:22,900 Again, you're answering a question that says something 897 00:48:22,900 --> 00:48:26,710 about typical rather than probable. 898 00:48:26,710 --> 00:48:30,270 And you're ignoring what you would perfectly well know, of 899 00:48:30,270 --> 00:48:34,050 course, if we drew it out in set theory terms. 900 00:48:34,050 --> 00:48:40,150 It can't be more likely that something is in here than in here, if the 901 00:48:40,150 --> 00:48:43,960 larger set includes it. 902 00:48:43,960 --> 00:48:45,880 That doesn't work really well. 903 00:48:45,880 --> 00:48:52,200 So, people are very bad at base rates. 904 00:48:52,200 --> 00:48:53,950 Is this a bad -- 905 00:48:53,950 --> 00:48:59,960 is the purpose of this lecture to say we're all morons here? 906 00:48:59,960 --> 00:49:00,950 No, not really. 907 00:49:00,950 --> 00:49:05,520 The faults we have, the problems that we have in the 908 00:49:05,520 --> 00:49:11,340 way that we reason about problems like this, reflect -- 909 00:49:11,340 --> 00:49:16,170 What Kahneman and Tversky talked about were, if I'm 910 00:49:16,170 --> 00:49:19,420 spelling it right, heuristics. 911 00:49:19,420 --> 00:49:24,400 Sort of, mental shortcuts that do a certain 912 00:49:24,400 --> 00:49:28,000 amount of work for us. 913 00:49:28,000 --> 00:49:31,340 Our job isn't actually out there in the world to figure 914 00:49:31,340 --> 00:49:38,050 out set theory problems. 915 00:49:38,050 --> 00:49:42,210 Our problem is to get home safely at night. 916 00:49:42,210 --> 00:49:45,770 So, you're walking down the street at night. 917 00:49:45,770 --> 00:49:49,110 And it's a dark kind of street. 918 00:49:49,110 --> 00:49:53,390 And you see this big guy with a stick. 919 00:49:53,390 --> 00:49:57,370 And he's walking down the same side of the street as you. 920 00:49:57,370 --> 00:49:59,570 Do you cross the street? 921 00:49:59,570 --> 00:50:02,910 Well, if you're in a Kahneman and Tversky kind of experiment 922 00:50:02,910 --> 00:50:05,170 you say, I know what I'm supposed to do here. 923 00:50:05,170 --> 00:50:08,090 I'm supposed to think hard about the base rates. 924 00:50:08,090 --> 00:50:11,500 Is it more likely, how many people are there in the world 925 00:50:11,500 --> 00:50:16,280 who are bad, evil, people with sticks that are going to, 926 00:50:16,280 --> 00:50:18,020 like, hit me.
927 00:50:18,020 --> 00:50:21,560 Versus, how many people are there who are, like, guys 928 00:50:21,560 --> 00:50:24,370 coming home from a softball game with their bat, or 929 00:50:24,370 --> 00:50:26,620 something like that? 930 00:50:26,620 --> 00:50:29,150 And they're nice people, mostly. 931 00:50:29,150 --> 00:50:31,200 Or at least even if they're bad, evil people, they're not 932 00:50:31,200 --> 00:50:33,080 really going to hit me. 933 00:50:33,080 --> 00:50:36,840 The answer is, there's a very small population of people out 934 00:50:36,840 --> 00:50:41,890 there with sticks wanting to hit you. 935 00:50:41,890 --> 00:50:44,580 But, on the occasion that you get that wrong, it's a really 936 00:50:44,580 --> 00:50:47,660 bad mistake to make. 937 00:50:47,660 --> 00:50:54,880 And so, a mental shortcut that says not, what's the base rate 938 00:50:54,880 --> 00:50:59,300 here, but is this typical of a situation that could be 939 00:50:59,300 --> 00:51:02,120 dangerous, sort of, flipping it around. 940 00:51:02,120 --> 00:51:04,200 Could this be a bad thing. 941 00:51:04,200 --> 00:51:08,110 If I make a mistake and he's really a nice person, so he 942 00:51:08,110 --> 00:51:10,110 might feel a little hurt that I crossed to the other side of 943 00:51:10,110 --> 00:51:10,660 the street. 944 00:51:10,660 --> 00:51:11,520 What's the big deal. 945 00:51:11,520 --> 00:51:14,160 If he's a guy with a bat and a nail in it and he's going to 946 00:51:14,160 --> 00:51:15,740 poke me in the head, and terrible things 947 00:51:15,740 --> 00:51:17,080 are going to happen. 948 00:51:17,080 --> 00:51:19,610 I'm not going to worry about the base rates here. 949 00:51:19,610 --> 00:51:27,180 And your cognitive hardware seems to have been set up 950 00:51:27,180 --> 00:51:29,720 under those sort of constraints more than it was 951 00:51:29,720 --> 00:51:37,150 set up under what might be optimal public policy solving 952 00:51:37,150 --> 00:51:38,320 constraints. 953 00:51:38,320 --> 00:51:45,440 So, this is why you can then run a political campaign where 954 00:51:45,440 --> 00:51:48,370 one side or the other spends a lot of time conjuring up 955 00:51:48,370 --> 00:51:51,190 images of bad guys with sticks, of 956 00:51:51,190 --> 00:51:53,330 some variety or other. 957 00:51:53,330 --> 00:51:56,470 Because you hear about the bad guy with 958 00:51:56,470 --> 00:51:58,160 the stick often enough. 959 00:51:58,160 --> 00:52:01,020 And if this guy promises he's going to save you from the bad 960 00:52:01,020 --> 00:52:03,740 guy with the stick, well, you'd better go with that. 961 00:52:03,740 --> 00:52:06,290 At least, that's an appealing thought. 962 00:52:06,290 --> 00:52:07,860 And it works for them. 963 00:52:07,860 --> 00:52:15,350 It works as a political technique. 964 00:52:15,350 --> 00:52:19,140 All right, this is all -- who knows about Henry, some 965 00:52:19,140 --> 00:52:21,270 weird truck driver that I invented or 966 00:52:21,270 --> 00:52:22,240 something like that? 967 00:52:22,240 --> 00:52:31,010 How about a realm where things really ought to be logical and 968 00:52:31,010 --> 00:52:32,150 mathematical? 969 00:52:32,150 --> 00:52:34,410 Which would be money. 970 00:52:34,410 --> 00:52:36,130 I mean, if there was ever going to be something that 971 00:52:36,130 --> 00:52:39,920 ought to just work out in terms of the math, you would 972 00:52:39,920 --> 00:52:41,960 think it would be money.
973 00:52:41,960 --> 00:52:45,570 And this is, of course, the reason why there is no Nobel 974 00:52:45,570 --> 00:52:47,630 Prize in psychology. 975 00:52:47,630 --> 00:52:49,730 But there is a Nobel Prize in economics. 976 00:52:49,730 --> 00:52:53,340 And that's what Kahneman, in fact, won. 977 00:52:53,340 --> 00:52:57,350 Kahneman and Tversky would have won it, but Tversky, 978 00:52:57,350 --> 00:52:59,810 unfortunately, died young. 979 00:52:59,810 --> 00:53:03,720 And you don't win the Nobel Prize posthumously. 980 00:53:03,720 --> 00:53:07,190 So, Kahneman and Tversky also looked at the way that people 981 00:53:07,190 --> 00:53:08,040 reason about money. 982 00:53:08,040 --> 00:53:11,560 This is something that goes back long before 983 00:53:11,560 --> 00:53:13,810 Kahneman and Tversky. 984 00:53:13,810 --> 00:53:16,600 And, I can perhaps illustrate one of the older 985 00:53:16,600 --> 00:53:18,530 bits with an example. 986 00:53:18,530 --> 00:53:24,440 Let's suppose that you're going to buy a toy car for 987 00:53:24,440 --> 00:53:27,430 your cousin. 988 00:53:27,430 --> 00:53:30,110 You like your cousin, just in case you were wondering here. 989 00:53:30,110 --> 00:53:32,610 So you're buying this toy car for your cousin. 990 00:53:32,610 --> 00:53:41,560 And there are two toy cars out there at the moment. 991 00:53:41,560 --> 00:53:45,800 They are, for present purposes, functionally 992 00:53:45,800 --> 00:53:49,440 equivalent on all dimensions of kid 993 00:53:49,440 --> 00:53:51,990 loveliness or something. 994 00:53:51,990 --> 00:54:01,950 There's the midnight blue, I don't know, Model PT roadster 995 00:54:01,950 --> 00:54:06,470 or something from Toys R Us. 996 00:54:06,470 --> 00:54:17,570 And the midnight green Model PT roadster from Kids Are 997 00:54:17,570 --> 00:54:18,820 Toys, or something. 998 00:54:27,830 --> 00:54:30,460 So, the difference is the color. 999 00:54:30,460 --> 00:54:32,790 The green color is cooler. 1000 00:54:32,790 --> 00:54:38,420 You know your cousin actually likes the green one better. 1001 00:54:38,420 --> 00:54:39,980 It's a bit cooler. 1002 00:54:39,980 --> 00:54:46,130 Anyway, the midnight blue one costs $12 and the midnight 1003 00:54:46,130 --> 00:54:49,610 green one costs $40. 1004 00:54:49,610 --> 00:54:53,360 How many people are going to buy one of them? 1005 00:54:53,360 --> 00:54:54,830 You have to pick one, again. 1006 00:54:54,830 --> 00:54:58,050 How many pick the blue one? 1007 00:54:58,050 --> 00:55:00,110 How many people pick the green one? 1008 00:55:00,110 --> 00:55:00,490 OK. 1009 00:55:00,490 --> 00:55:03,520 You guys have nice cousins, that's good. 1010 00:55:03,520 --> 00:55:08,690 All right, the blue one wins big-time. 1011 00:55:08,690 --> 00:55:09,900 All right. 1012 00:55:09,900 --> 00:55:14,280 You now have magically graduated from MIT. 1013 00:55:14,280 --> 00:55:16,570 Congratulations. 1014 00:55:16,570 --> 00:55:18,160 You got the job. 1015 00:55:18,160 --> 00:55:22,180 You're now going to buy the car. 1016 00:55:22,180 --> 00:55:23,880 And so, you're going to buy -- 1017 00:55:23,880 --> 00:55:27,030 you're going to buy a real PT roadster. 1018 00:55:27,030 --> 00:55:31,790 And they come in two different colors. 1019 00:55:31,790 --> 00:55:33,820 Amazingly, for today's purposes, they come in 1020 00:55:33,820 --> 00:55:35,260 midnight blue and midnight green. 1021 00:55:35,260 --> 00:55:41,120 The midnight blue one costs $30,012. 1022 00:55:41,120 --> 00:55:48,460 The midnight green one costs $30,040.
1023 00:55:48,460 --> 00:55:50,860 You like the green one better. 1024 00:55:50,860 --> 00:55:54,430 How many people are going to buy the green one? 1025 00:55:54,430 --> 00:55:58,190 OK, so you graduated from MIT with this brain. 1026 00:55:58,190 --> 00:56:00,220 It's the same. 1027 00:56:00,220 --> 00:56:02,060 So, obviously it's now going very much 1028 00:56:02,060 --> 00:56:02,950 in the other direction. 1029 00:56:02,950 --> 00:56:06,920 It's the same $28. 1030 00:56:06,920 --> 00:56:08,540 What's your problem here? 1031 00:56:08,540 --> 00:56:10,790 AUDIENCE: [UNINTELLIGIBLE] 1032 00:56:13,630 --> 00:56:15,220 PROFESSOR: You were going to -- 1033 00:56:15,220 --> 00:56:18,813 AUDIENCE: You're already paying over $30,000 so the $20 1034 00:56:18,813 --> 00:56:21,380 just seems very trivial. 1035 00:56:21,380 --> 00:56:22,440 PROFESSOR: Yeah. 1036 00:56:22,440 --> 00:56:23,570 It's the same $20. 1037 00:56:23,570 --> 00:56:25,820 This is absolutely right, of course. 1038 00:56:25,820 --> 00:56:27,823 But yeah. 1039 00:56:27,823 --> 00:56:29,073 AUDIENCE: [INAUDIBLE] 1040 00:56:35,140 --> 00:56:36,660 PROFESSOR: The toy car is probably going to last you 1041 00:56:36,660 --> 00:56:38,020 at least as long as the roadster. 1042 00:56:42,130 --> 00:56:45,340 I keep trying to fine-tune this to make this, you're not 1043 00:56:45,340 --> 00:56:47,620 going to buy 15 cars here, and -- 1044 00:56:47,620 --> 00:56:50,560 We're assuming a one-time purchase here, a one-time 1045 00:56:50,560 --> 00:56:52,670 purchase here. 1046 00:56:52,670 --> 00:56:57,630 And they're each going to last three years, or 100,000 miles, 1047 00:56:57,630 --> 00:56:59,040 or whichever comes first, or something. 1048 00:56:59,040 --> 00:56:59,320 Yeah. 1049 00:56:59,320 --> 00:57:00,570 AUDIENCE: [UNINTELLIGIBLE] 1050 00:57:04,370 --> 00:57:07,750 PROFESSOR: That much more money? 1051 00:57:07,750 --> 00:57:08,850 All right, all right. 1052 00:57:08,850 --> 00:57:11,470 Would your answer have changed radically if I say, 1053 00:57:11,470 --> 00:57:14,010 congratulations, you have just graduated. 1054 00:57:14,010 --> 00:57:17,240 It's also your cousin's birthday. 1055 00:57:17,240 --> 00:57:19,930 You'd all of a sudden be -- 1056 00:57:19,930 --> 00:57:23,020 Boy, there must be sort of a U-shaped function. 1057 00:57:23,020 --> 00:57:27,990 Because by the time you get to a parent or something, ask 1058 00:57:27,990 --> 00:57:31,060 your parents, who graduated a while ago. 1059 00:57:31,060 --> 00:57:35,550 Hey, Mommy, can I have the midnight green car, it only 1060 00:57:35,550 --> 00:57:39,930 costs three times what the other one cost. 1061 00:57:39,930 --> 00:57:42,520 Oh, actually I just sort of gave away the answer. 1062 00:57:42,520 --> 00:57:44,770 It's the three times issue. 1063 00:57:44,770 --> 00:57:46,930 And you were alluding to this already. 1064 00:57:46,930 --> 00:57:51,310 That money is perceived, even though it's obviously the same 1065 00:57:51,310 --> 00:57:57,030 $28, that people think about money in ratio terms and not 1066 00:57:57,030 --> 00:57:57,890 in absolute terms. 1067 00:57:57,890 --> 00:58:02,040 This was actually picked up by Bernoulli a couple of 1068 00:58:02,040 --> 00:58:03,860 hundred years ago. 1069 00:58:03,860 --> 00:58:04,980 I don't know which Bernoulli. 1070 00:58:04,980 --> 00:58:08,740 It turns out that there were buckets of Bernoullis. 1071 00:58:08,740 --> 00:58:13,040 Five Bernoullis, and they were all geniuses.
1072 00:58:13,040 --> 00:58:24,720 So, but one of the Bernoullis basically asked, what is the 1073 00:58:24,720 --> 00:58:26,610 psychological value of money? 1074 00:58:26,610 --> 00:58:29,810 Actually, in Kahneman and Tversky language, that's often 1075 00:58:29,810 --> 00:58:34,420 called the utility of money as a function of actual money. 1076 00:58:34,420 --> 00:58:41,460 And what Bernoulli asserted was that the utility of money, 1077 00:58:41,460 --> 00:58:44,910 the psychological value of the money, basically 1078 00:58:44,910 --> 00:58:49,030 went up with log money. 1079 00:58:49,030 --> 00:58:56,140 So it's basically preserving ratios and not the absolute. 1080 00:58:56,140 --> 00:59:01,030 It's not preserving the absolute values. 1081 00:59:01,030 --> 00:59:06,520 Which, by the way, when you go out to buy that first car when 1082 00:59:06,520 --> 00:59:08,590 you have all this money that you're apparently going to 1083 00:59:08,590 --> 00:59:10,550 have when you graduate. 1084 00:59:10,550 --> 00:59:16,110 The guy selling you the car knows all about this. 1085 00:59:16,110 --> 00:59:24,650 And he knows that, do you want the plain vanilla one, or do 1086 00:59:24,650 --> 00:59:28,360 you want the PT roadster with the really cool 1087 00:59:28,360 --> 00:59:30,940 racing stripe on it. 1088 00:59:30,940 --> 00:59:37,000 And this is part of our cool new graduate package, of stuff 1089 00:59:37,000 --> 00:59:42,490 that cost us $3.95 and we'll crank the price up by $500 1090 00:59:42,490 --> 00:59:46,080 here, because on $30,000, who can tell, right? 1091 00:59:46,080 --> 00:59:48,250 And he knows. 1092 00:59:48,250 --> 00:59:51,530 He knows that it's the same $28 here and here. 1093 00:59:51,530 --> 00:59:54,560 And his job is to get as many of those $28 1094 00:59:54,560 --> 00:59:57,790 out of you as possible. 1095 00:59:57,790 --> 01:00:01,770 And he will take advantage of exactly this fact. 1096 01:00:01,770 --> 01:00:05,360 If somebody said, I'll put a little stripe on your toy car 1097 01:00:05,360 --> 01:00:09,440 for an extra $28, well it's a smaller car. 1098 01:00:09,440 --> 01:00:13,220 I'll put two stripes on it because it's a little car. 1099 01:00:13,220 --> 01:00:15,830 You'll say, get away from me. 1100 01:00:15,830 --> 01:00:20,290 But just listen to yourself go for the options when you go 1101 01:00:20,290 --> 01:00:25,790 off and buy that car eventually. 1102 01:00:25,790 --> 01:00:27,270 All right. 1103 01:00:27,270 --> 01:00:30,150 Let us take a momentary break. 1104 01:00:30,150 --> 01:00:33,570 And then I will say a word more about the oddities of 1105 01:00:33,570 --> 01:00:35,476 money here. 1106 01:00:35,476 --> 01:01:42,510 [RANDOM NOISE] 1107 01:01:42,510 --> 01:01:43,760 OK. 1108 01:01:47,530 --> 01:02:04,690 Let us contemplate a little more of the way in which the 1109 01:02:04,690 --> 01:02:06,970 -- now, the reason this is important. 1110 01:02:06,970 --> 01:02:10,080 The reason that this is the sort of stuff that eventually 1111 01:02:10,080 --> 01:02:14,360 gets -- where working this out is worth the 1112 01:02:14,360 --> 01:02:20,050 attention of the Nobel committee in economics is that 1113 01:02:20,050 --> 01:02:25,680 economists, for a long time, had this notion, had a 1114 01:02:25,680 --> 01:02:29,300 psychological theory, in effect, of what people were 1115 01:02:29,300 --> 01:02:30,140 doing economically.
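A rough numerical sketch of that log-utility point, using the toy-car and real-car prices from the demo above (natural log is assumed here; the exact base doesn't matter for the ratio argument):

    import math

    # The same $28 difference, on a dollar scale and on a log (ratio) scale.
    toy = (12, 40)
    car = (30_012, 30_040)

    for cheap, pricey in (toy, car):
        dollar_gap = pricey - cheap                    # $28 both times
        log_gap = math.log(pricey) - math.log(cheap)   # big for the toy, tiny for the car
        print(dollar_gap, round(log_gap, 4))
    # 28 1.204    (the $40 toy "feels" more than three times as expensive)
    # 28 0.0009   (the $30,040 car "feels" the same as the $30,012 one)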
1116 01:02:30,140 --> 01:02:35,040 And it was the notion of a rational consumer, a rational 1117 01:02:35,040 --> 01:02:38,280 person who's making decisions that were in a sense 1118 01:02:38,280 --> 01:02:40,020 propositional. 1119 01:02:40,020 --> 01:02:44,590 That they were doing the math to the best of their ability, 1120 01:02:44,590 --> 01:02:46,550 and working things out on that basis. 1121 01:02:46,550 --> 01:02:50,930 And what the Kahneman and Tversky program of research 1122 01:02:50,930 --> 01:02:54,740 tells you is the very interesting constraints on 1123 01:02:54,740 --> 01:02:56,630 that rationality. 1124 01:02:56,630 --> 01:03:00,730 The ways in which it sort of systematically deviates from 1125 01:03:00,730 --> 01:03:03,360 what your rational computer might think about the same 1126 01:03:03,360 --> 01:03:04,510 kind of question. 1127 01:03:04,510 --> 01:03:07,040 So, consider the following. 1128 01:03:07,040 --> 01:03:13,310 Let's think about doing some gambling games. 1129 01:03:13,310 --> 01:03:15,000 Where this is a one-shot game. 1130 01:03:15,000 --> 01:03:16,700 You only get a chance to play this game once. 1131 01:03:16,700 --> 01:03:17,520 I got a coin. 1132 01:03:17,520 --> 01:03:19,910 It's a fair coin. 1133 01:03:19,910 --> 01:03:28,090 If I flip it and it comes up heads, I give you a penny. 1134 01:03:28,090 --> 01:03:31,250 If it comes up tails, you give me a penny. 1135 01:03:31,250 --> 01:03:33,950 How many people are willing to play that game? 1136 01:03:38,420 --> 01:03:39,200 Few people. 1137 01:03:39,200 --> 01:03:42,090 AUDIENCE: [UNINTELLIGIBLE] 1138 01:03:42,090 --> 01:03:42,540 PROFESSOR: No, no. 1139 01:03:42,540 --> 01:03:43,080 One shot. 1140 01:03:43,080 --> 01:03:45,470 One time. 1141 01:03:45,470 --> 01:03:49,490 The whole game is this one penny. 1142 01:03:49,490 --> 01:03:54,720 How many people are willing to play one round of this game? 1143 01:03:54,720 --> 01:03:58,270 AUDIENCE: Give me the quarter instead. 1144 01:03:58,270 --> 01:04:00,360 PROFESSOR: Oh, she wants to play for real money. 1145 01:04:00,360 --> 01:04:01,450 OK. 1146 01:04:01,450 --> 01:04:02,480 That's fine. 1147 01:04:02,480 --> 01:04:04,340 Let's do that instead. 1148 01:04:04,340 --> 01:04:06,080 So. 1149 01:04:06,080 --> 01:04:08,130 Most people are willing to play that, though. 1150 01:04:08,130 --> 01:04:10,990 By this point in the lecture a variety of people are starting 1151 01:04:10,990 --> 01:04:14,320 to wonder what it is that I'm revealing about myself if I 1152 01:04:14,320 --> 01:04:17,660 say I'll do this. 1153 01:04:17,660 --> 01:04:23,280 So, the way you write that, you could write that as, 1154 01:04:23,280 --> 01:04:29,870 you've got a 0.5 chance of winning a penny and a 0.5 1155 01:04:29,870 --> 01:04:32,410 chance of losing a penny. 1156 01:04:32,410 --> 01:04:35,990 And the expected value of this is 0.5 -- 1157 01:04:35,990 --> 01:04:37,950 you can figure that out, it's 0. 1158 01:04:37,950 --> 01:04:39,230 It's a fair game. 1159 01:04:39,230 --> 01:04:42,810 OK, let's play it again. 1160 01:04:42,810 --> 01:04:44,250 We'll flip, we'll play for her. 1161 01:04:44,250 --> 01:04:45,540 Flip the coin. 1162 01:04:45,540 --> 01:04:48,860 Heads, I give you $100. 1163 01:04:48,860 --> 01:04:51,130 Tails, you give me $100. 1164 01:04:51,130 --> 01:04:52,380 How many people want to play? 1165 01:04:56,080 --> 01:04:58,340 Your computer doesn't care much about this, because the 1166 01:04:58,340 --> 01:05:00,760 expected value remains 0, right?
1167 01:05:00,760 --> 01:05:04,024 What's the problem? 1168 01:05:04,024 --> 01:05:05,274 AUDIENCE: [UNINTELLIGIBLE] 1169 01:05:08,490 --> 01:05:10,080 PROFESSOR: People are risk averse. 1170 01:05:10,080 --> 01:05:13,120 Yeah, that's certainly one. 1171 01:05:13,120 --> 01:05:15,340 That's what you find here. 1172 01:05:15,340 --> 01:05:20,520 But why don't you want to play even once? 1173 01:05:20,520 --> 01:05:21,770 AUDIENCE: [UNINTELLIGIBLE] 1174 01:05:26,630 --> 01:05:29,610 PROFESSOR: No, the expected value is just a thing defined 1175 01:05:29,610 --> 01:05:31,530 in math land or statistics land. 1176 01:05:31,530 --> 01:05:35,230 It's zero on this. 1177 01:05:35,230 --> 01:05:38,180 But the two, you could either win or lose, it's true. 1178 01:05:38,180 --> 01:05:39,950 You're not going to come out at 0. 1179 01:05:39,950 --> 01:05:43,080 But if we played it for everybody in the classroom, if 1180 01:05:43,080 --> 01:05:45,830 I had a lot of $100 bills. 1181 01:05:45,830 --> 01:05:47,250 Gee, I wouldn't want to play that game. 1182 01:05:47,250 --> 01:05:47,570 Yeah. 1183 01:05:47,570 --> 01:05:51,698 AUDIENCE: The expected utility of playing is less than the 1184 01:05:51,698 --> 01:05:55,310 expected utility of not playing. 1185 01:05:55,310 --> 01:05:55,920 PROFESSOR: Hmm. 1186 01:05:55,920 --> 01:05:57,400 That sounded cool. 1187 01:06:00,440 --> 01:06:02,730 But I'm not being quick enough to figure this out. 1188 01:06:02,730 --> 01:06:04,550 What did you -- 1189 01:06:04,550 --> 01:06:07,630 AUDIENCE: I'm saying that winning $100 is less good than 1190 01:06:07,630 --> 01:06:07,700 losing $100 is bad. 1191 01:06:07,700 --> 01:06:07,880 PROFESSOR: Thank you. 1192 01:06:07,880 --> 01:06:11,460 That's exactly right. 1193 01:06:11,460 --> 01:06:14,140 It turns out that this curve is not 1194 01:06:14,140 --> 01:06:16,800 symmetrical around the origin. 1195 01:06:16,800 --> 01:06:21,570 That negative money, the utility of negative money, 1196 01:06:21,570 --> 01:06:25,340 tends to be somewhat steeper and more linear 1197 01:06:25,340 --> 01:06:28,010 than positive money. 1198 01:06:28,010 --> 01:06:33,380 With the result that, exactly what the gentleman said. 1199 01:06:33,380 --> 01:06:41,720 That losing, that the absolute value of losing $100 on this 1200 01:06:41,720 --> 01:06:47,720 scale is greater than the absolute 1201 01:06:47,720 --> 01:06:51,150 value of gaining $100. 1202 01:06:51,150 --> 01:06:55,080 I mean, one way to intuit this is, there's plenty of room for 1203 01:06:55,080 --> 01:06:56,940 you to gain $100. 1204 01:06:56,940 --> 01:06:58,220 You can do that all the time. 1205 01:06:58,220 --> 01:07:02,770 Do you have $100 handy that you can afford to lose? 1206 01:07:02,770 --> 01:07:04,470 Maybe not. 1207 01:07:04,470 --> 01:07:06,720 It would just hurt a lot more. 1208 01:07:06,720 --> 01:07:08,060 And if we decide -- 1209 01:07:08,060 --> 01:07:10,490 And now we can -- 1210 01:07:10,490 --> 01:07:12,440 There are some versions of this -- it's not that you'll 1211 01:07:12,440 --> 01:07:14,260 never play a game for this kind of stakes. 1212 01:07:14,260 --> 01:07:17,600 I mean, if I say I'll flip a coin. 1213 01:07:17,600 --> 01:07:22,170 Heads, you give me a penny, tails, I give you $100, how 1214 01:07:22,170 --> 01:07:24,980 many people want to play? 1215 01:07:24,980 --> 01:07:26,780 And the other ones didn't figure out what I said, 1216 01:07:26,780 --> 01:07:29,700 presumably.
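A minimal sketch of that asymmetry, with an assumed value function. The 2.25 loss weight and 0.88 exponent are commonly quoted prospect-theory numbers, not figures from this lecture, and this form only captures the "steeper" part of the claim, not the "more linear" part.

    def value(x, loss_weight=2.25, curvature=0.88):
        # gains are valued on a compressive scale; losses are weighted more heavily
        if x >= 0:
            return x ** curvature
        return -loss_weight * (-x) ** curvature

    # The 50/50 win-$100 / lose-$100 flip: zero expected dollars,
    # but clearly negative expected psychological value.
    expected_dollars = 0.5 * 100 + 0.5 * (-100)
    expected_value = 0.5 * value(100) + 0.5 * value(-100)
    print(expected_dollars, round(expected_value, 1))   # 0.0 -36.0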
1217 01:07:29,700 --> 01:07:34,200 So, somewhere between that extreme and the even game is 1218 01:07:34,200 --> 01:07:37,060 the point at which we would measure how 1219 01:07:37,060 --> 01:07:38,030 risk averse you are. 1220 01:07:38,030 --> 01:07:40,160 It'd be different for different people. 1221 01:07:40,160 --> 01:07:46,110 I mean, would you play if heads, one time play, heads 1222 01:07:46,110 --> 01:07:48,870 you give me $50, tails I give you $100? 1223 01:07:48,870 --> 01:07:52,260 Now, some people would be willing to play. 1224 01:07:52,260 --> 01:07:53,600 He's good, he's got $50 handy. 1225 01:07:53,600 --> 01:07:55,250 Or do you have a question? 1226 01:07:55,250 --> 01:07:55,840 No, you were just -- 1227 01:07:55,840 --> 01:07:56,000 AUDIENCE: [UNINTELLIGIBLE] 1228 01:07:56,000 --> 01:07:57,700 198. 1229 01:07:57,700 --> 01:07:58,910 PROFESSOR: 198, OK. 1230 01:07:58,910 --> 01:08:01,040 He's got -- talk to him later. 1231 01:08:01,040 --> 01:08:05,000 He's got more money than he knows what to do with. 1232 01:08:05,000 --> 01:08:09,510 But, in any case, there will be a balance point 1233 01:08:09,510 --> 01:08:12,090 at which your -- 1234 01:08:12,090 --> 01:08:14,950 where the loss and the gain balance out. 1235 01:08:14,950 --> 01:08:18,050 And then you'd be presumably willing to play. 1236 01:08:18,050 --> 01:08:23,000 Strange things happen at the extremes. 1237 01:08:23,000 --> 01:08:28,430 And that's the basis for a willingness to play lotteries. 1238 01:08:28,430 --> 01:08:39,290 So, if you have a wager that says, heads you give me $1, 1239 01:08:39,290 --> 01:08:47,360 tails, I give you $10 million, but the probability of coming 1240 01:08:47,360 --> 01:08:55,980 up tails equals 10 to the minus, I don't know, 80 or 1241 01:08:55,980 --> 01:08:57,230 something like that. 1242 01:08:59,620 --> 01:09:02,820 I'm not getting the right numbers, particularly, for a 1243 01:09:02,820 --> 01:09:03,530 state lottery. 1244 01:09:03,530 --> 01:09:04,520 But it's not -- 1245 01:09:04,520 --> 01:09:06,000 it's that kind of thing. 1246 01:09:06,000 --> 01:09:08,580 The reason people run state lotteries is because they want 1247 01:09:08,580 --> 01:09:09,190 your dollars. 1248 01:09:09,190 --> 01:09:14,310 Not because they're interested in giving you a pile of money. 1249 01:09:14,310 --> 01:09:17,730 So, even so, by the time you get these very extreme kinds 1250 01:09:17,730 --> 01:09:23,080 of gambles, as long as the cost is sort of minimal down 1251 01:09:23,080 --> 01:09:26,030 here and the potential payoff is huge, for instance, you're 1252 01:09:26,030 --> 01:09:28,710 willing to play radically unfair games that you would 1253 01:09:28,710 --> 01:09:37,890 never play in the range of, in the middle kind of range. 1254 01:09:37,890 --> 01:09:47,020 If I tell you, well all right, we can make this just a 1255 01:09:47,020 --> 01:09:48,450 tenfold difference. 1256 01:09:48,450 --> 01:09:51,210 So if we play a game that -- 1257 01:09:51,210 --> 01:09:52,500 you can figure this out. 1258 01:09:52,500 --> 01:09:54,370 At the extremes you're willing to risk $1 1259 01:09:54,370 --> 01:09:55,620 to make $100 million. 1260 01:09:57,920 --> 01:09:59,800 Even if the odds are against you. 1261 01:09:59,800 --> 01:10:02,210 In a more moderate game, you would not be 1262 01:10:02,210 --> 01:10:04,650 willing to do that. 1263 01:10:04,650 --> 01:10:07,940 And it's that sort of reasoning, or lack thereof, 1264 01:10:07,940 --> 01:10:12,750 that allows lotteries to make their money.
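The lottery arithmetic, sketched with made-up but more lottery-like numbers than the 10 to the minus 80 joked about above; the $10 million prize is the figure from the lecture, the ticket price and odds are assumptions:

    stake = 1.0                 # assumed ticket price
    prize = 10_000_000.0        # the $10 million figure used above
    p_win = 1e-8                # assumed odds, roughly 1 in 100 million

    # Expected return per ticket: a tiny chance at the prize minus the near-certain loss of the stake.
    expected_return = p_win * prize - (1 - p_win) * stake
    print(round(expected_return, 2))   # about -0.9: you expect to lose most of the dollar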
1265 01:10:12,750 --> 01:10:14,040 Even -- 1266 01:10:14,040 --> 01:10:18,310 it's probably also confounded there by the fact that it's not 1267 01:10:18,310 --> 01:10:21,900 clear that the population as a whole really understands that 1268 01:10:21,900 --> 01:10:23,580 this is a sucker's game. 1269 01:10:23,580 --> 01:10:26,450 That the state is actually in the business of taking your 1270 01:10:26,450 --> 01:10:30,150 money from you, not just redistributing the wealth. 1271 01:10:30,150 --> 01:10:35,220 But even people who perfectly well understand that their 1272 01:10:35,220 --> 01:10:38,690 chances are less than even of winning, even if they were to 1273 01:10:38,690 --> 01:10:42,890 buy 10 to the 80th tickets, continue to buy lottery 1274 01:10:42,890 --> 01:10:43,950 tickets and continue to play. 1275 01:10:43,950 --> 01:10:46,250 It's a strange way to raise money for your state, but 1276 01:10:46,250 --> 01:10:49,190 never mind. 1277 01:10:49,190 --> 01:10:55,330 So, money is like other aspects of reasoning. 1278 01:10:55,330 --> 01:11:00,660 It is subject to the sorts of narrative stories that say, it 1279 01:11:00,660 --> 01:11:07,360 hurts more to lose $100 than it feels good to gain $100. 1280 01:11:07,360 --> 01:11:12,990 The sort of narrative that is beyond just the simple math of 1281 01:11:12,990 --> 01:11:15,810 the situation. 1282 01:11:15,810 --> 01:11:17,530 Let me -- 1283 01:11:17,530 --> 01:11:20,590 OK, I have time to do one more example. 1284 01:11:20,590 --> 01:11:23,250 Which is an interestingly illustrative one. 1285 01:11:23,250 --> 01:11:24,950 And involves these cards. 1286 01:11:24,950 --> 01:11:26,800 Which I believe I put on the handout. 1287 01:11:26,800 --> 01:11:30,060 This is known as the Wason selection task, because good 1288 01:11:30,060 --> 01:11:32,740 old Wason developed it. 1289 01:11:32,740 --> 01:11:34,820 And, oh look, the rules are properly 1290 01:11:34,820 --> 01:11:36,340 described on the handout. 1291 01:11:36,340 --> 01:11:39,910 You've got four cards here. 1292 01:11:39,910 --> 01:11:43,940 Each card has a letter on one side and a number 1293 01:11:43,940 --> 01:11:45,080 on the other side. 1294 01:11:45,080 --> 01:11:48,140 You know that. 1295 01:11:48,140 --> 01:11:53,750 What you want to know is, and here's the rule. 1296 01:11:53,750 --> 01:12:03,260 The rule is, if the card has a vowel on one side, then it has 1297 01:12:03,260 --> 01:12:06,740 an odd number on the other side. 1298 01:12:06,740 --> 01:12:10,480 That's the rule that we want to check. 1299 01:12:10,480 --> 01:12:15,710 Which cards, what's the minimum and full set of cards 1300 01:12:15,710 --> 01:12:20,750 that you need to flip to check whether that rule pertains for 1301 01:12:20,750 --> 01:12:22,360 this set of cards. 1302 01:12:22,360 --> 01:12:23,910 That's the question. 1303 01:12:23,910 --> 01:12:27,040 And so we'll vote on each card as we go along. 1304 01:12:27,040 --> 01:12:32,400 How many people want to flip E? 1305 01:12:32,400 --> 01:12:32,750 All right. 1306 01:12:32,750 --> 01:12:36,980 Lots and lots of people want to flip E. How many people 1307 01:12:36,980 --> 01:12:39,850 want to flip the S? 1308 01:12:39,850 --> 01:12:41,510 Hm. 1309 01:12:41,510 --> 01:12:43,960 Couple of people want to flip the S. How many people want to 1310 01:12:43,960 --> 01:12:45,210 flip the 7? 1311 01:12:47,510 --> 01:12:49,120 A bunch of people want to flip the 7. 1312 01:12:49,120 --> 01:12:52,660 How many people want to flip the 4?
1313 01:12:52,660 --> 01:12:57,770 About the same bunch of people want to flip the 4. 1314 01:12:57,770 --> 01:12:59,320 And some of the same -- 1315 01:12:59,320 --> 01:13:02,090 I probably, well, it's too complicated to collect all the 1316 01:13:02,090 --> 01:13:07,230 details of how many cards people think you need to flip. 1317 01:13:07,230 --> 01:13:09,160 The answer is, you need to flip two and 1318 01:13:09,160 --> 01:13:12,010 only two cards here. 1319 01:13:12,010 --> 01:13:15,250 And before I tell you which ones they are, I should say that 1320 01:13:15,250 --> 01:13:19,780 like many of the demos in this lecture, this is the sort of 1321 01:13:19,780 --> 01:13:23,750 thing that drives MIT students nuts. 1322 01:13:23,750 --> 01:13:26,380 Because they -- it's a cognition lecture, it's a 1323 01:13:26,380 --> 01:13:27,125 logic lecture. 1324 01:13:27,125 --> 01:13:28,660 It's got, like, mathy things in it. 1325 01:13:28,660 --> 01:13:29,550 And it's logic. 1326 01:13:29,550 --> 01:13:30,810 And that's why I'm here. 1327 01:13:30,810 --> 01:13:33,530 And I'm getting them all weird and wrong, and, and, and, and, 1328 01:13:33,530 --> 01:13:35,730 and I'm going to flunk all my courses. 1329 01:13:35,730 --> 01:13:38,140 And I'm going to die, or something. 1330 01:13:38,140 --> 01:13:41,970 I mean. 1331 01:13:41,970 --> 01:13:45,210 I should note that one of the characteristics of the Wason 1332 01:13:45,210 --> 01:13:48,010 selection task is that everybody gets it wrong. 1333 01:13:48,010 --> 01:13:50,810 I don't remember if it was Wason originally who did it. 1334 01:13:50,810 --> 01:13:55,280 But he tried this out on logicians and they blew it. 1335 01:13:55,280 --> 01:13:59,200 And if you're feeling, like, depressed after this, you take 1336 01:13:59,200 --> 01:14:07,430 this to the calculus TA next hour, and see 1337 01:14:07,430 --> 01:14:08,480 if she can do it. 1338 01:14:08,480 --> 01:14:13,880 So, as people correctly figure, you've got 1339 01:14:13,880 --> 01:14:16,300 to pull this guy. 1340 01:14:16,300 --> 01:14:19,280 Because if it's got an even number on the other side 1341 01:14:19,280 --> 01:14:20,380 you're dead. 1342 01:14:20,380 --> 01:14:21,690 And you don't have to do anything because 1343 01:14:21,690 --> 01:14:22,540 we don't know -- 1344 01:14:22,540 --> 01:14:27,220 there's no assertion about what non-vowels are. 1345 01:14:27,220 --> 01:14:29,480 You don't have to flip him. 1346 01:14:29,480 --> 01:14:32,570 Because it doesn't say, it says -- 1347 01:14:32,570 --> 01:14:34,333 here, let's do this in good logic terms. 1348 01:14:34,333 --> 01:14:37,840 Which for mysterious reasons always talks about P. The 1349 01:14:37,840 --> 01:14:46,080 assertion is, if P, then Q. If vowel, then odd. 1350 01:14:46,080 --> 01:14:50,010 If P then Q. If you know Q, who cares. 1351 01:14:50,010 --> 01:14:54,330 It doesn't say, if Q then P. So this could be an S without 1352 01:14:54,330 --> 01:14:56,910 violating the rule. 1353 01:14:56,910 --> 01:15:01,270 This guy, on the other hand, is not Q. This guy's not P. 1354 01:15:01,270 --> 01:15:04,290 Everybody figured out we don't care about not-P. Not-Q is the 1355 01:15:04,290 --> 01:15:08,390 other one that you do need to check. 1356 01:15:08,390 --> 01:15:12,290 Because if this has an E on the other side, the 1357 01:15:12,290 --> 01:15:15,320 statement is false. 1358 01:15:15,320 --> 01:15:16,120 The rule is false. 1359 01:15:16,120 --> 01:15:20,040 So you've got to check for an E on the other side of this.
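A rough sketch of that card logic in code; the point is simply that only the vowel and the even number can falsify "if vowel, then odd":

    def could_falsify(visible_face):
        # The rule "if vowel on one side, then odd on the other" can only be
        # broken by a vowel hiding an even number, or an even number hiding a vowel.
        vowels = set("AEIOU")
        if visible_face in vowels:
            return True                  # flip it: an even back breaks the rule
        if visible_face.isdigit() and int(visible_face) % 2 == 0:
            return True                  # flip it: a vowel back breaks the rule
        return False                     # consonants and odd numbers can't break it

    for card in ["E", "S", "7", "4"]:
        print(card, could_falsify(card))   # E True, S False, 7 False, 4 True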
1360 01:15:20,040 --> 01:15:24,212 So, there are now three people here who say, I've got it, 1361 01:15:24,212 --> 01:15:24,400 I've got it. 1362 01:15:24,400 --> 01:15:26,540 And good for you. 1363 01:15:26,540 --> 01:15:27,080 That's nice. 1364 01:15:27,080 --> 01:15:31,150 People routinely do very badly on this. 1365 01:15:31,150 --> 01:15:37,610 And the interesting question is why -- well, actually, 1366 01:15:37,610 --> 01:15:39,720 I'm not sure that's an interesting question. 1367 01:15:39,720 --> 01:15:40,710 Why you do that. 1368 01:15:40,710 --> 01:15:48,970 The interesting fact is that people do very well on other 1369 01:15:48,970 --> 01:15:52,260 versions of exactly this problem. 1370 01:15:52,260 --> 01:15:59,490 So let's set up another version of this problem. 1371 01:15:59,490 --> 01:16:05,510 We've got somebody with a -- 1372 01:16:05,510 --> 01:16:06,760 a beer. 1373 01:16:09,120 --> 01:16:13,520 We've got somebody with a soda. 1374 01:16:13,520 --> 01:16:17,490 We've got an 18-year-old. 1375 01:16:17,490 --> 01:16:20,120 And we've got a 22-year-old. 1376 01:16:20,120 --> 01:16:28,530 The rule is, you have to be 22 to drink. 1377 01:16:28,530 --> 01:16:29,420 Well, 21. 1378 01:16:29,420 --> 01:16:33,100 You have to be over 21 to drink. 1379 01:16:33,100 --> 01:16:36,500 Which of these cards do we need to check? 1380 01:16:36,500 --> 01:16:40,170 So do we need to check this one? 1381 01:16:40,170 --> 01:16:41,260 Yeah, sure, right. 1382 01:16:41,260 --> 01:16:45,240 If this sucker's an 18-year-old, he's going down. 1383 01:16:45,240 --> 01:16:47,900 Do we need to check this one? 1384 01:16:47,900 --> 01:16:49,390 We don't care about that. 1385 01:16:49,390 --> 01:16:51,590 Do we need to check this one? 1386 01:16:51,590 --> 01:16:52,110 Yeah. 1387 01:16:52,110 --> 01:16:53,920 We need to check this one? 1388 01:16:53,920 --> 01:16:55,990 No, we don't care about that. 1389 01:16:55,990 --> 01:16:59,610 People are perfect at this, essentially. 1390 01:16:59,610 --> 01:17:02,010 Sometimes if I phrase it right you can manage to get yourself 1391 01:17:02,010 --> 01:17:02,650 a little confused. 1392 01:17:02,650 --> 01:17:05,360 So if you got it wrong don't sit there and say, I'll never 1393 01:17:05,360 --> 01:17:06,610 drink again. 1394 01:17:13,830 --> 01:17:17,460 So, people are perfectly good at this sort 1395 01:17:17,460 --> 01:17:20,970 of real world example. 1396 01:17:20,970 --> 01:17:22,740 It turns out that they're not good at 1397 01:17:22,740 --> 01:17:24,290 all real world examples. 1398 01:17:24,290 --> 01:17:29,050 I could, I won't bother trying to generate another real world 1399 01:17:29,050 --> 01:17:30,720 example here. 1400 01:17:30,720 --> 01:17:35,700 Leda Cosmides has argued that what people are good at 1401 01:17:35,700 --> 01:17:38,850 reasoning about is cheating. 1402 01:17:38,850 --> 01:17:42,510 Is somebody getting goodies that they're not supposed to 1403 01:17:42,510 --> 01:17:48,810 get? Is this 18-year-old getting a drink when I can't 1404 01:17:48,810 --> 01:17:50,950 get a drink yet? 1405 01:17:50,950 --> 01:17:53,930 I'm going to keep good track of him. 1406 01:17:53,930 --> 01:17:59,250 And the guy drinking this beer looks kind of young to me. 1407 01:17:59,250 --> 01:18:07,160 Whereas, that four -- yeah, nobody cares about that.
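The same check in the bar-carding frame, as a sketch; the four cases are the lecture's, and the under-21 cutoff is its loose "over 21" rule:

    def needs_checking(visible):
        # Rule: if drinking beer, then over 21. Only the beer drinker (check the
        # age) and the 18-year-old (check the drink) could reveal a violation.
        if visible == "beer":
            return True
        if isinstance(visible, int) and visible < 21:
            return True
        return False

    for case in ["beer", "soda", 18, 22]:
        print(case, needs_checking(case))   # beer True, soda False, 18 True, 22 False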
1408 01:18:07,160 --> 01:18:09,700 And if you reframe the problem, there are plenty of 1409 01:18:09,700 --> 01:18:13,360 ways to reframe this, to put things like, oh, there's a 1410 01:18:13,360 --> 01:18:16,950 version about, if you're in Seattle it must be raining. 1411 01:18:16,950 --> 01:18:18,220 Examples like that. 1412 01:18:18,220 --> 01:18:20,370 People bomb that all over the place. 1413 01:18:20,370 --> 01:18:23,850 There's nothing at stake there. 1414 01:18:23,850 --> 01:18:26,880 Cosmides and a variety of the evolutionary psych people 1415 01:18:26,880 --> 01:18:33,390 argue that your cognitive capabilities have been shaped 1416 01:18:33,390 --> 01:18:35,890 by the problems that you need to solve out 1417 01:18:35,890 --> 01:18:36,950 there in the world. 1418 01:18:36,950 --> 01:18:39,640 Is somebody getting goodies they shouldn't be getting? 1419 01:18:39,640 --> 01:18:41,110 Is the guy with the stick going to 1420 01:18:41,110 --> 01:18:42,430 hit me with the stick? 1421 01:18:42,430 --> 01:18:43,350 That kind of thing. 1422 01:18:43,350 --> 01:18:45,770 It would be lovely if that was true. 1423 01:18:45,770 --> 01:18:46,610 It might be true. 1424 01:18:46,610 --> 01:18:50,820 The problem is that the research subsequent to this 1425 01:18:50,820 --> 01:18:52,900 has made the issue more complicated. 1426 01:18:52,900 --> 01:18:57,690 It is not -- there are cheating examples that people 1427 01:18:57,690 --> 01:19:01,350 have cooked up that bomb. 1428 01:19:01,350 --> 01:19:03,730 And there are non-cheating examples that people 1429 01:19:03,730 --> 01:19:05,620 successfully manage to solve. 1430 01:19:05,620 --> 01:19:08,610 But what does seem to be pretty clear is that we are 1431 01:19:08,610 --> 01:19:12,110 not built to solve abstract problems. 1432 01:19:12,110 --> 01:19:16,250 That's why you have to show up at MIT to learn 1433 01:19:16,250 --> 01:19:18,290 that kind of stuff. 1434 01:19:18,290 --> 01:19:21,660 Nobody goes to MIT to take courses in 1435 01:19:21,660 --> 01:19:25,580 carding guys at bars. 1436 01:19:25,580 --> 01:19:26,890 You come with that. 1437 01:19:26,890 --> 01:19:29,610 You can figure that out for yourself. 1438 01:19:29,610 --> 01:19:32,710 It's when you have -- unfortunately. 1439 01:19:32,710 --> 01:19:36,510 The problem of running the country turns out to be 1440 01:19:36,510 --> 01:19:40,610 probably a lot more like this than like this. 1441 01:19:40,610 --> 01:19:43,620 But the decision is made like this. 1442 01:19:43,620 --> 01:19:45,410 So, everybody go watch the debate and 1443 01:19:45,410 --> 01:19:47,230 tell me what you find.