The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high-quality educational resources for free. To make a donation or view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

ABBY NOYCE: First, a little bit about how we make decisions. So, one model of how we make decisions is what's called the expected utility model. And the expected utility model basically says that if you have two possibilities, you look at what you think the likely outcomes of each of these possibilities are, how probable those outcomes are, and how you value those outcomes. So, if you think that if I make choice A, then I'm very likely to have a good thing happen but there's a small possibility of having something really bad happen, whereas choice B is kind of neutral, then depending on just how you value these different outcomes, you're going to make different choices.

AUDIENCE: Like expected value?

ABBY NOYCE: It's all about-- yeah-- expected value. So, you can think of this as like-- if you're trying to make decisions like what to do. It's Friday night. You've got an exam on Monday. What are the things you could do? So, you've got a decision. Plan A is stay home and study, right? You could stay home. You could be a diligent student. And let's suppose that if you stay home and study, then it's almost certain that you'll do well on the exam.

AUDIENCE: There's a 1% chance you'll waste your whole time playing computer games.

ABBY NOYCE: Right, let's keep this a little bit simpler for a minute. Your other option is to do something else. What might you do on a Friday night if you don't want to stay home and study?

AUDIENCE: Go to a party.

ABBY NOYCE: Go to a party. What are some possible outcomes of going to a party? Good things that might happen, bad things that might happen.
AUDIENCE: You meet new people.

AUDIENCE: You might have fun.

ABBY NOYCE: You might have fun. Or it might be a party where you don't know anybody and nobody seems to want to talk to you. It might be a party where you have no fun. So, what do you think-- give me a relative likelihood of these two things. Are these equally likely outcomes? Is one more likely than the other?

AUDIENCE: Well, it depends whose party.

ABBY NOYCE: Depends whose party. Think of a party. Pick one at random, some kid you know from school but you don't know really well. But they're having a party and you heard about it.

AUDIENCE: Were you invited?

ABBY NOYCE: Were you invited? I don't know.

AUDIENCE: We just go to the party anyway.

ABBY NOYCE: I've done that. So, all right. Let's say that there's a 75% chance that you'll have fun and a 25% chance that you'll be miserable. And we'll say that this is a certainty, that there's a 100% chance that you'll do well on your exam if you stay home and study. How good or bad are these outcomes on a scale from minus 10 to 10, where 0 is kind of neutral, minus 10 is the worst thing that could possibly happen to you, and 10 is amazing? Where is having fun at a party on this value scale? Is it a 2? Is it an 8? Is it a minus 3?

AUDIENCE: 6.

ABBY NOYCE: Probably not. A 6? Sure.

AUDIENCE: 7.5.

ABBY NOYCE: How bad is it to go to a party and be miserable?

AUDIENCE: 0.

ABBY NOYCE: It's just neutral? It's not bad?

AUDIENCE: That's a negative.

AUDIENCE: Minus 5.

ABBY NOYCE: Give me a negative. Someone give me a negative number. Minus--

AUDIENCE: Negative 7.

ABBY NOYCE: Negative 7? All right. How much of a good-- how important is it to you to do well on an exam?
AUDIENCE: 20.

ABBY NOYCE: Scale of minus 10 to 10.

AUDIENCE: 10.

ABBY NOYCE: It's a 10? This is super important? So, we probably want something that makes it more valuable than having fun at a party, maybe. Let's call it an 8. OK. So, the expected utility model would look at this and say, this option has a 100% chance of getting a value of 8. So, the value of this branch of my decision tree is 8. The expected value is 8. The value of this other branch of my decision tree-- remember, we assigned probabilities to these, and this outcome is a lot more likely. So, it would be 3/4 of 6 plus a quarter of minus 7, which is--

AUDIENCE: That's negative 7.

ABBY NOYCE: So, what is it? It's 3/4 of 6, which is 4 and 1/2, plus a quarter of minus 7, which is minus 1.75, right? So, 4 and 1/2, 3 and 1/2, 2 and--

AUDIENCE: .75.

ABBY NOYCE: Thank you. 2 and 3/4, yes. So, according to an expected utility model, if this is how you weight all of these things-- the goodness here, the badness over here, and the probabilities of each of these-- then which of these is going to be the more rational option to pick?

AUDIENCE: Study.

ABBY NOYCE: Stay home and study. On the other hand, the fact that that may not be the choice everybody would make in this situation probably shows that people either assess these probabilities differently or weight the outcomes differently. But this is a model of how you can expect people to behave.

AUDIENCE: I think that we forgot to take into account being bored while we study.

ABBY NOYCE: So, you think that studying should be-- stay home and study--

AUDIENCE: It has a short-term outcome and a long-term outcome.

ABBY NOYCE: So, the total outcome should be maybe lower. You think our assigned values are off.
154 00:06:46,173 --> 00:06:46,756 AUDIENCE: Yes. 155 00:06:46,756 --> 00:06:50,095 AUDIENCE: What are we considering "well" on the exam? 156 00:06:50,095 --> 00:06:53,180 ABBY NOYCE: Up to you. 157 00:06:53,180 --> 00:06:56,130 AUDIENCE: 105%. 158 00:06:56,130 --> 00:06:57,400 ABBY NOYCE: Sure. 159 00:06:57,400 --> 00:06:59,520 Anyway, this is a model. 160 00:06:59,520 --> 00:07:01,710 So, in cases where you think that people 161 00:07:01,710 --> 00:07:03,719 are going to make rational decisions, 162 00:07:03,719 --> 00:07:05,010 this is a model that gets used. 163 00:07:05,010 --> 00:07:07,590 So, like, economists think about decision-making 164 00:07:07,590 --> 00:07:10,110 in terms of maximizing your expected outcome 165 00:07:10,110 --> 00:07:12,850 and things like that. 166 00:07:12,850 --> 00:07:15,750 So, the thing is that decisions that are actually 167 00:07:15,750 --> 00:07:18,540 made by real live human beings, who are not 168 00:07:18,540 --> 00:07:20,790 perfectly rational creatures, as you've probably 169 00:07:20,790 --> 00:07:22,560 noticed at some point-- 170 00:07:22,560 --> 00:07:24,960 people make decisions that are different. 171 00:07:24,960 --> 00:07:29,942 So, here's an example, kind of a hypothetical question 172 00:07:29,942 --> 00:07:31,650 that is a good example of how this works. 173 00:07:31,650 --> 00:07:35,812 Suppose the nation is preparing for the outbreak of a disease. 174 00:07:35,812 --> 00:07:37,520 We know that there is an epidemic coming. 175 00:07:37,520 --> 00:07:40,259 And it's expected to kill 200 people. 176 00:07:40,259 --> 00:07:42,300 And the government and the national health people 177 00:07:42,300 --> 00:07:47,400 and the CDC are like, OK, there are two options that we could-- 178 00:07:47,400 --> 00:07:49,440 kind of like 24-- there are two options 179 00:07:49,440 --> 00:07:51,080 that we could do to deal with this. 180 00:07:51,080 --> 00:07:57,450 In one option, we are certain to save 200 of these 600 people. 181 00:07:57,450 --> 00:08:00,780 And in the other option, there's a one third probability 182 00:08:00,780 --> 00:08:03,780 that we'll save all of them, but there's a 2/3 probability 183 00:08:03,780 --> 00:08:07,200 that all 600 people will die from this disease. 184 00:08:07,200 --> 00:08:09,616 Which of these options would you support? 185 00:08:09,616 --> 00:08:11,035 AUDIENCE: The first one. 186 00:08:11,035 --> 00:08:14,832 [INTERPOSING VOICES] 187 00:08:14,832 --> 00:08:17,040 ABBY NOYCE: You may not even be one of the 600 people 188 00:08:17,040 --> 00:08:18,670 who's expected to die in this. 189 00:08:18,670 --> 00:08:21,990 You may be safely ensconced in your, I don't know, 190 00:08:21,990 --> 00:08:23,940 your apocalyptic little hideaway. 191 00:08:23,940 --> 00:08:27,090 But who thinks that option one looks like a more appealing 192 00:08:27,090 --> 00:08:30,260 option here? 193 00:08:30,260 --> 00:08:31,750 AUDIENCE: I'd run away. 194 00:08:31,750 --> 00:08:33,375 ABBY NOYCE: Who thinks option two looks 195 00:08:33,375 --> 00:08:35,220 like a more appealing option? 196 00:08:35,220 --> 00:08:37,350 You are forced to choose. 197 00:08:37,350 --> 00:08:38,549 Pick one or the other. 198 00:08:38,549 --> 00:08:39,080 Option one? 199 00:08:44,450 --> 00:08:45,520 Option one. 200 00:08:45,520 --> 00:08:48,010 Put your hands up where I can see them so I can count you. 201 00:08:48,010 --> 00:08:49,270 Make a decision. 202 00:08:49,270 --> 00:08:51,490 One, two, three, four, five, six. 
Option two? And three. Now, look at our expected utility model. These come out the same if you do the math. One third of 600, if you multiply the probability times the number, is 200 people. So, according to this model, the expected utility of these is the same. And yet, a majority of people like option one better. Why? Well, let's get to that. Here's another scenario, same idea. We're preparing for the outbreak of a disease. There are two options presented. The first option will result in 400 of these people dying, no matter what we do. With the second option, there's a 33% chance that nobody will die at all, that everyone will be saved, and a two-thirds probability that 600 people are going to die. Which of these options do you like better?

AUDIENCE: Isn't that the same thing?

ABBY NOYCE: Who likes option one better? Raise your hands. Who likes option two better? Raise your hands. So, that's it. All right, so by seeing them both together, you guys notice that these are exactly the same scenario, right? The numbers are identical. But if you present them to people separately, what you'll find is that in the first case, when this is phrased as how many people are going to be saved, people like certainty. People are risk-averse. They don't want to pick a risky option when the question is couched in terms of how much we're going to gain. When the question is couched in terms of how much you might lose, when we're looking at deaths, people start being more attracted to the riskier possibility. When you have a chance of something bad happening, but a chance that it might not, people will go for risk. Whereas if we phrase it as a chance of something good happening or a chance that it might not, people will go for certainty. People like certainty for good things and risk for bad things.
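[EDITOR'S NOTE: A minimal Python sketch of the expected-utility arithmetic above, using the probabilities and values assigned in class. The function name and structure are illustrative, not anything from the lecture.]

```python
# Expected utility: sum of (probability x value) over the possible outcomes.

def expected_utility(outcomes):
    """outcomes: list of (probability, value) pairs whose probabilities sum to 1."""
    return sum(p * v for p, v in outcomes)

# The Friday-night decision, with the class's numbers:
study = expected_utility([(1.00, 8)])               # certain 8
party = expected_utility([(0.75, 6), (0.25, -7)])   # fun vs. miserable
print(study, party)  # 8.0 vs 2.75 -> studying is the "rational" pick

# The two framings of the disease problem, counted in lives saved:
option_one = expected_utility([(1.0, 200)])            # 200 saved for certain
option_two = expected_utility([(1/3, 600), (2/3, 0)])  # gamble on saving all 600
print(option_one, option_two)  # 200.0 vs 200.0 -> identical expected value
```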
We're risk-averse when things are discussed in terms of gains, and risk-seeking when they're discussed in terms of losses. Does this make sense? Think of it as, like, gambling. If I tell you, you can either win $50 for sure, or you have a 75% chance of winning $150 and a 25% chance of winning nothing, which of these are you going to pick? 50 certain dollars or a gamble on getting more than that?

AUDIENCE: More.

ABBY NOYCE: Who would pick 50 certain dollars? Who would pick the gamble?

AUDIENCE: A 75% chance of what?

ABBY NOYCE: $150. The expected utility model says that these are equal, right? But when it's couched as things you can get, people tend to be-- and there's going to be variation in this for amounts and for personality-- people tend to be more attracted to the certain thing. If I spin that the other way around and say, well, you could either lose $50 for certain, or you could take a gamble where there's a 25% chance of losing nothing and a 75% chance of losing $150, which one are you going to prefer?

AUDIENCE: The second one.

ABBY NOYCE: The gamble. Who thinks they'd prefer the gamble? Who thinks they'd prefer to lose $50 for certain? Again, the expected utility math says that they're the same. But people like the risk.

AUDIENCE: Because there's a difference between losing $50 and not gaining $50. Losing $50 means you lose money. Not gaining $50 means you just don't-- nothing happens.

ABBY NOYCE: That's part of it. So, somebody put together something similar with a study, where they said, imagine you're at a casino-- and they set it up as two possible scenarios. You win $200 and you have a chance of gaining some more beyond that, or you win $400 and you have a chance of losing some beyond that, so that all the scenarios kind of average out. So, you ended up with $250 or something. And I don't remember the exact numbers, and I could probably work it out, but it would take me 10 minutes.
And what they found is that even when your net outcome would have been the same across either version-- win $200 and then win some more, versus win $400 and then lose some-- people are, again, more attracted to the certain thing. When they have a chance of gaining something, they'd rather take a smaller, certain gain than a riskier, larger gain. And the opposite is true when they stand to lose something. They'd rather take a risk on a possibly larger loss, with a chance of not losing anything, than a smaller, certain loss. So, even when the amounts are the same, our perception of the change is different. It's like we feel that losing $50 is a bigger change than gaining $50, that one of these is a bigger change than the other.

AUDIENCE: Is it like the [INAUDIBLE] when you have nothing to lose, like kind of go for it type thing?

ABBY NOYCE: It's probably related to that, although I'd say that that might come more from this attraction to risk when we're losing something. So, you're like, eh, what am I going to lose? Actually, I suspect that it's more like, if you feel like you have nothing to lose, then you're not risking as much. So, what might seem like a risky possibility to someone who's got a vested interest just isn't as much of a risk. The bad things that could happen are less bad, in that case. But it's probably related. There's a lot of studies looking at how people assess risk. And the outcome is that in a lot of ways we're not-- you can set up a lot of scenarios in which we're not very good at it. Probably because the kinds of risks that we evolved to deal with, the risks you have to deal with as a primate running around on the savanna trying to not get eaten by a lion, are very different from the kinds of risks you have to assess as a human being living in a city and dealing with cars and finances and all of these other things that we try and make sense out of every day.
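[EDITOR'S NOTE: One standard way to model this gain/loss asymmetry-- not something derived in the lecture-- is the value function from Tversky and Kahneman's prospect theory. A minimal sketch using their commonly cited 1992 parameter estimates; the numbers are illustrative.]

```python
# Prospect-theory-style value function: losses loom larger than gains.
# alpha = 0.88 and loss_aversion = 2.25 are Tversky & Kahneman's (1992)
# commonly cited estimates; they are illustrative here, not from the lecture.

def value(x, alpha=0.88, loss_aversion=2.25):
    if x >= 0:
        return x ** alpha                       # diminishing value of gains
    return -loss_aversion * ((-x) ** alpha)     # losses weighted ~2x as heavily

print(value(50))    # ~31.3: how good gaining $50 feels
print(value(-50))   # ~-70.4: losing $50 feels more than twice as bad
```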
So, our intuitive math for these things doesn't deal well with a lot of situations that we find ourselves in. Yeah?

AUDIENCE: If you start off with $200, don't people feel like, oh, I already have $200, so I can only gain, I can't lose? Like, they have something to push off from and go forward. Whereas if you have $400, they're like, oh, I could lose more than $200 or something.

ABBY NOYCE: Maybe, although I suspect you'd see the same outcomes no matter what your base amounts were. I suspect you'd see the same thing with $10 and $20 and values ending up somewhere in the middle. And I suspect you'd see it-- I think it's a factor of how the questions are phrased, whether it's set up as a situation of loss or a situation of gain, versus where the baselines actually are. All right. So, humans are kind of generally bad at numbers. We're good at some other things, too. So, the rules of thumb that people use when they're trying to decide-- and this is less decision making per se-- but when we're trying to figure out how to make sense out of the world. Who here has taken some amount of CS, done some programming? Some amount of computer science, some amount of programming in school or for a hobby. A few, OK. So, when you're writing software, you're interested in algorithms, right? An algorithm is a series of steps that will always get you a right answer. People, humans, human brains don't seem to be so much on algorithms. We tend to take a lot of shortcuts. We tend to use what are called heuristics. And a heuristic is like a general strategy that will usually get you the right answer. It's good enough.
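[EDITOR'S NOTE: The algorithm/heuristic distinction in code form, as a toy sketch. The first function is an algorithm-- it always answers correctly; the second is a heuristic-- a few random spot checks that usually agree with it but can miss a violation. The function names are made up for illustration.]

```python
import random

# Algorithm: a procedure guaranteed to give the right answer.
def is_sorted(xs):
    """Checks every adjacent pair; assumes xs has at least two elements."""
    return all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1))

# Heuristic: a cheap shortcut that is usually, but not always, right.
def probably_sorted(xs, samples=5):
    """Spot-checks a few random adjacent pairs -- 'good enough', not certain."""
    for _ in range(samples):
        i = random.randrange(len(xs) - 1)
        if xs[i] > xs[i + 1]:
            return False
    return True  # no violation found in the spot checks
```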
But if you're interested in how decision-making works, how people make some kinds of judgment calls and things like that, then-- just like how, in vision, it's interesting to look at the places where your visual system makes mistakes, looking at things like optical illusions in order to understand how it works-- people who study decision-making look at places where these heuristics seem to lead us to making bad decisions, or decisions that are wrong relative to the facts, or where we come to a conclusion that is not actually true. So, I want to talk about three major heuristics that we seem to use in order to understand how our world works. First up is what's called the representativeness heuristic. So, suppose you toss a coin six times. Which of these potential outcomes is most likely? Heads, heads, heads, tails, tails, tails; or tails, heads, tails, heads, tails, heads; or heads, tails, tails, heads, tails, heads; or heads, heads, heads, heads, heads, heads? Which of these is most likely?

AUDIENCE: It's the same, no matter what?

ABBY NOYCE: It is the same, no matter what. How many people have kind of an intuitive perception that, for example, the last of these is less likely? Even if you know the math, even if you know better from a math perspective, it feels less likely, right? If you flipped a coin six times and got all heads, you'd be like, whoa. And yet, it's no less likely than any other specific sequence. But what happens is that because we know that we're taking a random sample from a pretty random decision space, we think a sample is more likely if it is similar to the population from which it's selected. So, this is kind of the "if it looks like a duck and it walks like a duck and it quacks like a duck, it's probably a duck" heuristic. If the sample that we're looking at seems to match the population that it's drawn from, then we figure it's probably right.
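[EDITOR'S NOTE: A quick check of the coin-flip claim. Every specific sequence of six fair flips has probability (1/2)^6 = 1/64, all-heads included; a sketch that enumerates them.]

```python
from itertools import product

# Every specific sequence of six fair coin flips is equally likely: (1/2)**6.
sequences = list(product("HT", repeat=6))
print(len(sequences))   # 64 possible sequences
print((1 / 2) ** 6)     # 0.015625 -- the probability of each one,
                        # "HHHHHH" and "HTTHTH" alike
```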
So, for all that we're going to talk about the places where they break, heuristics are generally useful. I mean, think about it. They've got to be. We wouldn't hang on to strategies for getting around in the world if they weren't reasonably good at it. You'd get wiped out of the gene pool. It just wouldn't stick around. So, the representativeness heuristic is good for some things. It fails when dealing with randomness-- people are generally bad with randomness. It also fails when you're thinking about complicated categories-- categories that human beings tend to fall into. So, suppose I were to describe for you my friend Marie. Marie is a sophomore in college. And she tends to wear a lot of black-- like a black t-shirt with a long-sleeve fishnet shirt underneath it, baggy black skater pants with some D-rings and straps on them. You guys know the look, right? Which of these two things do you think is more likely to be true about Marie? Marie is an economics major, or Marie is an economics major and one of the folks who puts on Rocky Horror at our university every Halloween. Option A or option B, which is more likely? So, who thinks it's option A? Raise your hands. Who thinks it's option B? Raise your hands. Which is odd, because the math says that if I've got a conjunction of two things-- where Marie is both an economics major and helps put on the Rocky Horror Show-- that can't be more likely than simply the fact that she's an economics major. So, this is a case where the fact that Marie, this small sample of one, matches our perception of people who do Rocky Horror better than our perception of people who are economics majors makes it seem like one of these things is more likely. Something to watch out for.
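[EDITOR'S NOTE: The conjunction rule behind the Marie question, in two lines: for any events A and B, P(A and B) = P(A) x P(B given A), which can never exceed P(A). A sketch with made-up probabilities.]

```python
# Conjunction rule: P(A and B) can never exceed P(A).
# The numbers here are made up purely for illustration.
p_econ = 0.10              # P(Marie is an economics major)
p_rocky_given_econ = 0.30  # P(does Rocky Horror | econ major)

p_both = p_econ * p_rocky_given_econ
print(p_both)  # 0.03 -- necessarily <= 0.10, however vivid the description
```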
One more kind of thing-- here, think about a stats question. So, picture a town that has two hospitals, a big one and a little one. They both have an OB/GYN unit. They both have babies. About 45 babies a day are born in the big hospital and about 15 in the little one. And you guys probably know that pretty much half of all babies are boys, right? But there's going to be some day-to-day fluctuation in this. So, you might have one day where there's more boys than girls, one day where there's more girls than boys. But over time, it would average out to 50%. So, suppose that these hospitals are keeping a record of how often more than 60% of the babies born were boys. Which hospital do you think is going to have more days in a given year where more than 60% of the babies born that day were boys? The larger or the smaller, or about the same? Who thinks it's the larger? The smaller? About the same? That's what most people tend to say. But this is one of those things where you have what's called the small-sample fallacy. Because the smaller hospital has a smaller sample size, it's going to see skews from the average more frequently. If I toss a coin three times, you probably wouldn't be shocked if it turned up all heads or all tails, right? If I tossed the coin 20 times and it was all heads or all tails, you'd be a little bit more taken aback. If I tossed it 100 times and got all heads, it would be really weird, right? There's, what, like a 12.5% chance of getting three heads, and a much, much, much smaller chance-- 1 over 2 to the 20-- of getting 20 heads. So, when you take a sample, the smaller your sample is relative to the population you're drawing from, the more likely your sample is to be really different from that population.
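[EDITOR'S NOTE: The hospital question is easy to simulate. A sketch assuming births are independent fair coin flips, 45 a day at the big hospital and 15 at the small one, counting days in a 365-day year where boys exceed 60%.]

```python
import random

def extreme_days(births_per_day, days=365, threshold=0.6):
    """Count days where the share of boys exceeds the threshold."""
    count = 0
    for _ in range(days):
        boys = sum(random.random() < 0.5 for _ in range(births_per_day))
        if boys / births_per_day > threshold:
            count += 1
    return count

random.seed(0)
print("big hospital (45/day):  ", extreme_days(45))  # roughly 25 days a year
print("small hospital (15/day):", extreme_days(15))  # roughly 55 days a year
```

The small hospital's daily sample is smaller, so it drifts past the 60% mark more than twice as often.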
Suppose I went to your high school, had the list of all the kids in your grade, closed my eyes, and picked three. What are the chances they'd all be girls? Not huge, but it wouldn't be shocking if I picked out three kids at random and they were all girls. What if I picked out 50? It's a lot less likely. The larger the sample that we're taking, the more likely it is to be representative of the population we're drawing from. So, one of the places where people make mistakes with this representativeness heuristic is when they're looking at small sample sizes but not taking that into account when they try to infer what the larger population would be. This is the same mistake people make if they know three people who fall into a category and therefore make some kind of broad generalization-- like, I know three guys who are engineers, therefore all engineers are guys. No, it doesn't work that way. But you've got to watch out for it. All right. Another pattern people use is what's called the availability heuristic. So, read this list of names. And look at me when you're done so I know. Just read them. I'm not going to quiz you on them or anything, I promise. All right. Notice anything about them? Anything that calls itself to your attention? Yeah, some of them are authors. Good.

AUDIENCE: I only recognize the author names.

ABBY NOYCE: Here's a question. Were there more women's names or men's names?

AUDIENCE: Men's.

AUDIENCE: Men's.

AUDIENCE: I couldn't tell.

ABBY NOYCE: Who thinks there were more guys' names than women's names? Who thinks there were more women's names than men's names? Got to pick one. I'm going to make you pick one.

AUDIENCE: [INAUDIBLE]

ABBY NOYCE: Nope. Forced choice, you've got to pick one. Who thinks there were more women than men listed there?
AUDIENCE: [INAUDIBLE]

ABBY NOYCE: Who thinks there were more men than women listed? So, there were more guys than women. On the other hand, you guys probably noticed that the women's names are names you recognized, right? At least some of them. The guys' names, probably not. The guys' names were just kind of pulled at random. So, the availability heuristic is a guideline we use where we judge the frequency of an occurrence based on how easy it is for us to pull up relevant examples of it. This is kind of the mistake that's being made-- who here knows somebody who is afraid of flying, doesn't like to fly, possibly because they think that they're going to die in some kind of terrible plane crash? Who here knows somebody who's afraid to fly, but who drives or rides in cars that people drive? What's with that? We all know that you're way more likely to die terribly in a car accident than a plane accident, by like a factor of 100, right? But a lot of people feel like dying in a plane crash is more likely. And one of the reasons for this is that car accidents don't generally make the news. Car accidents happen. They might show up on page four of the newspaper or something. Whereas plane crashes are a week-long media extravaganza. They're on the news 24/7 if a plane goes down. You've probably encountered this. So, one of the things that happens is, if you're trying to assess the relative frequency of car accidents versus plane crashes, you might have an easier time retrieving the plane crash incidents from your memory. You heard more about them. You thought more about them. They were big news when they happened. And so, you perceive plane crashes as being more frequent than car crashes. So, this availability heuristic is good for lots of things. If I ask you, do you think there are more guys or girls in your high school class, or is it about the same?
It's probably pretty close to about the same. But one of the ways that we answer this kind of question is by thinking through the members of our high school class that we know and asking, how many of them are girls? How many of them are guys? And so, if the number of girls and guys that you can recall is roughly equal, you're going to guess that they're about the same. So, two things affect how available something is in memory. One of them is recency-- how recently we thought about something. So, suppose there was a big plane crash and it's in the news, and a week later you ask people to estimate what their odds are of dying in a plane crash. They'll give you one number. And suppose six months later, you find the same people and ask them to give you another estimate. Which of these numbers do you think is going to be bigger?

AUDIENCE: The first one?

ABBY NOYCE: Probably the first one, the one immediately after whatever it is. Because the recency of the event makes it more available to them. It's easier to remember. It's right there on top of your memory. And so, your perception of how frequently this stuff happens is skewed by that.

AUDIENCE: I heard that's how people choose the numbers that they gamble on when they're playing, say, roulette. They say, hey, this number has appeared frequently. I'm going to bet on this one because it's a "hot" number.

ABBY NOYCE: Yeah, you'll see that. It's called the gambler's fallacy when it shows up in something that's entirely random, like roulette, like flipping a coin. There's this kind of idea, because it really feels like whatever has just happened is going to affect what happens next. Probably not true.
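[EDITOR'S NOTE: The gambler's fallacy is easy to test empirically. A sketch that simulates a million fair coin flips and checks the frequency of heads immediately after a run of three heads-- it stays at about 50%.]

```python
import random

# After a streak of three heads, is the next flip still 50/50? Simulate it.
random.seed(1)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

after_streak = [flips[i] for i in range(3, len(flips))
                if flips[i - 3] and flips[i - 2] and flips[i - 1]]
print(sum(after_streak) / len(after_streak))  # ~0.5 -- the streak changes nothing
```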
AUDIENCE: OK, is there a game where you have a bullet in the gun, but it's only one bullet, and then you spin the barrel and then you try to kill yourself?

AUDIENCE: I think that's Russian roulette.

ABBY NOYCE: Russian roulette.

AUDIENCE: It is a game, though.

ABBY NOYCE: I'm not sure I'd call it a game, but yes. More often you'll hear it in the context of being used as a metaphor, like such-and-so is equivalent to playing--

[SIDE CONVERSATION]

ABBY NOYCE: So, the other thing that affects this availability idea is how familiar you are with the outcome. So, again-- how many people think that more kids get kidnapped in the United States today than 50 years ago?

AUDIENCE: I don't know. I've never really thought about it.

ABBY NOYCE: This is a more interesting question to ask grownups, people who actually were paying attention to the news 20, 30 years ago. Because the thing is that it's actually pretty stable-- kidnapping rates in this country are pretty stable. And yet, over time, you'll see things like parents getting more concerned. They don't let their kids play on their own as much. They don't let their kids get places on their own as much. You'll see these patterns where, if you ask your parents, were they allowed to go around town alone? Were they allowed to take the T alone? Were they allowed to ride their bike to school? You'll see that 30 years ago, people were allowed to do this stuff, and today, kids aren't. And one of the things I think affects this-- and I'm going to try not to let this devolve into my "the news media is out to get you" rant, which is separate-- is that there's been this transition, so that something like a missing kid goes from being local news in the city where it happened and the immediate surroundings, to national news.
So, our perception of how many of these are going on increases. And because we know more about them, we are more familiar with the incidents. And so, this increased familiarity, again, makes it easier for us to think of them. We judge them as being riskier, more frequent occurrences. [INAUDIBLE] One more heuristic that we tend to use, that I wanted to bring up-- this is what's called the anchoring and adjustment heuristic. So, when we're trying to estimate the size of something, or figure out, I don't know, how much it should cost me to fly to Denver, we'll make an approximation. And then, as we get new information, we'll adjust that approximation. But what often tends to happen is that we give that first number, that anchor number, too much weight. Even if it was just a random guess, even if it had nothing to do, whatsoever, with whatever we're trying to estimate, it tends to affect the adjustments that we make. We won't make big enough changes away from that original anchor. So, one example of this would be, suppose I want to fly to Denver this weekend-- which I do, but I'm not going to, because I'm poor and Denver is expensive to get to, because it takes a plane to get there. Colorado. And suppose I set aside some money to fly to Denver, and I start looking for airplane tickets. And the first airplane tickets I look at are like $1,200. And I go, oh boy, no, not so much. And I start doing some hunting, some looking for cheaper fares. And eventually, I hunt and I find smaller fares and smaller fares. And by the time I find fares that are $700, it looks like a really good deal. So, I buy them, because $700 is a lot less than what I originally saw. On the other hand, say I had budgeted $500 to do this. Then $700 is more than I had anticipated spending.
But because I had that anchor of $1,200 set, adjustments from it feel significant, even when they're not. Even though I hadn't gotten down to what I'd planned to spend on tickets, it felt like a much better deal than what I'd originally seen. So, this is the anchoring and adjustment heuristic. And what's interesting with this is that people do it even when the information that's being set as an anchor has nothing, whatsoever, to do with what they're trying to estimate. So, Tversky and Kahneman asked people to estimate percentages. What percentage of UN delegates are from Africa? What percentage of states in the United States have more than one state university? What percentage of people in the United States live in a city-- probably a lot higher now than it was in 1974? Various percentages. And they would ask the participant this question, and they had a wheel, like a mini Wheel of Fortune wheel, with just the numbers from 0 to 100 arranged in order around the rim. And so, they would ask the question, they would spin the wheel, and then the wheel would stop on something. And then they'd have the participants indicate their answer by moving the wheel so that the pointer was on the percentage that they thought was true. We got this? So, they've got a wheel. They spin the wheel. So, if I ask you what percentage of UN delegates are from African nations, I spin the wheel, the wheel hits 10, and you think it's probably higher than 10. So, you're going to move it up from there. And what's interesting-- what they found is that this entirely randomly chosen number-- I spun this wheel right in front of you, you saw me do it, you know it's random-- affects the answers that people give.
808 00:36:14,630 --> 00:36:19,280 You know, if the wheel stops on a low number, most 809 00:36:19,280 --> 00:36:22,310 of the people are moving it up from that low number. 810 00:36:22,310 --> 00:36:25,760 The estimates that they give are consistently lower than 811 00:36:25,760 --> 00:36:29,330 if it lands on a high number and people are moving it 812 00:36:29,330 --> 00:36:30,503 down. 813 00:36:30,503 --> 00:36:32,435 AUDIENCE: Is there a set interval? 814 00:36:36,590 --> 00:36:38,390 ABBY NOYCE: I don't know if they compared-- 815 00:36:38,390 --> 00:36:39,765 the way I would probably do this 816 00:36:39,765 --> 00:36:41,270 is I would compare a group where I 817 00:36:41,270 --> 00:36:42,894 didn't have any wheel-spinning nonsense 818 00:36:42,894 --> 00:36:45,230 and just had them guess a percentage, 819 00:36:45,230 --> 00:36:47,030 and took the average of those, so that I 820 00:36:47,030 --> 00:36:50,120 could see what the general population believes 821 00:36:50,120 --> 00:36:52,505 the number of African nations in the UN 822 00:36:52,505 --> 00:36:54,590 to be, and compare from there. 823 00:36:54,590 --> 00:36:58,400 But I couldn't find any data that they had done this. 824 00:36:58,400 --> 00:37:01,970 The book in which I was reading 825 00:37:01,970 --> 00:37:04,550 about this experiment 826 00:37:04,550 --> 00:37:07,370 gave an example of the wheel landing on 10. 827 00:37:07,370 --> 00:37:10,352 Participants would give an answer of 25%. 828 00:37:10,352 --> 00:37:14,950 With the wheel landing on 60, people had to move it down. 829 00:37:14,950 --> 00:37:16,740 They'd give an answer of more like 45%. 830 00:37:16,740 --> 00:37:19,190 So, a reasonably big range between these two, but I 831 00:37:19,190 --> 00:37:21,868 don't have a good solid answer for it. 832 00:37:21,868 --> 00:37:23,365 AUDIENCE: Could it be because they 833 00:37:23,365 --> 00:37:26,360 wouldn't want to move it so much from where it was randomly-- 834 00:37:26,360 --> 00:37:28,190 ABBY NOYCE: --because it was so heavy. 835 00:37:28,190 --> 00:37:28,984 Well, yeah. 836 00:37:28,984 --> 00:37:30,650 That's pretty much what we're saying-- 837 00:37:30,650 --> 00:37:34,220 that once a number has been settled on, 838 00:37:34,220 --> 00:37:36,800 not even consciously, we're reluctant to move too far 839 00:37:36,800 --> 00:37:37,730 away from it. 840 00:37:37,730 --> 00:37:39,782 So, we tend to want to make small changes rather 841 00:37:39,782 --> 00:37:40,490 than big changes. 842 00:37:43,760 --> 00:37:46,514 It's like, suppose your favorite ice cream place 843 00:37:46,514 --> 00:37:47,930 kept the cost of an ice cream cone 844 00:37:47,930 --> 00:37:49,730 at $2 for years and years and years. 845 00:37:49,730 --> 00:37:52,010 And then all of a sudden, it went up to $4. 846 00:37:52,010 --> 00:37:53,360 You'd be like, what? 847 00:37:53,360 --> 00:37:57,252 The price of my ice cream just doubled, I don't think so. 848 00:37:57,252 --> 00:37:58,365 AUDIENCE: Ridiculous! 849 00:37:58,365 --> 00:37:59,990 ABBY NOYCE: You'd be ticked off, right? 850 00:37:59,990 --> 00:38:01,760 You'd probably buy less ice cream. 851 00:38:01,760 --> 00:38:04,709 But if it went up gradually over the same amount of time-- 852 00:38:04,709 --> 00:38:06,500 like, instead of being the same for so long 853 00:38:06,500 --> 00:38:10,130 and then going up suddenly, if it went up by $0.25 854 00:38:10,130 --> 00:38:13,310 a year or something-- you'd be a lot less averse to that.
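Numerically, the insufficient-adjustment idea is easy to see in a small simulation. The Python sketch below is purely illustrative-- the 30% underlying belief, the 60% adjustment rate, and the noise level are invented numbers, not anything from the lecture or from Tversky and Kahneman's paper:

    # Minimal sketch of insufficient adjustment from a random anchor.
    # Assumed for illustration: everyone's underlying belief is 30%, and
    # people close only 60% of the gap between the anchor and that belief.
    import random

    TRUE_BELIEF = 30.0      # hypothetical average belief, in percent
    ADJUSTMENT_RATE = 0.6   # fraction of the anchor-to-belief gap people close

    def estimate_from_anchor(anchor: float) -> float:
        """Start at the anchor and adjust only part of the way toward the belief."""
        noise = random.gauss(0, 5)  # individual variation
        return anchor + ADJUSTMENT_RATE * (TRUE_BELIEF - anchor) + noise

    for anchor in (10, 60):
        estimates = [estimate_from_anchor(anchor) for _ in range(10_000)]
        mean = sum(estimates) / len(estimates)
        print(f"anchor {anchor:2d} -> mean estimate {mean:.1f}%")

With these made-up parameters, the low anchor produces mean estimates around 22% and the high anchor around 42%-- the same qualitative gap as the 25% versus 45% reported above, even though the anchor itself carries no information at all.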
855 00:38:13,310 --> 00:38:16,280 So, that's another example where big changes in some kind 856 00:38:16,280 --> 00:38:18,249 of number are disconcerting. 857 00:38:18,249 --> 00:38:19,040 We don't like them. 858 00:38:19,040 --> 00:38:20,540 Little changes we're pretty OK with. 859 00:38:31,961 --> 00:38:32,955 All right. 860 00:38:35,940 --> 00:38:38,280 So, kind of take-away things from this. 861 00:38:38,280 --> 00:38:41,010 So, decisions that we make, judgments that we make, 862 00:38:41,010 --> 00:38:43,860 all of these are influenced by a lot of things 863 00:38:43,860 --> 00:38:45,890 that we aren't directly aware of. 864 00:38:45,890 --> 00:38:46,980 One of the things that's interesting in all 865 00:38:46,980 --> 00:38:48,896 of this heuristics research is that when you point out 866 00:38:48,896 --> 00:38:51,370 to people that they tend to use these heuristics, 867 00:38:51,370 --> 00:38:54,094 that they tend to make these kinds of statistical judgments, 868 00:38:54,094 --> 00:38:56,010 and that this is a piece of information they should 869 00:38:56,010 --> 00:38:59,410 take into account, they get better at it. 870 00:38:59,410 --> 00:39:01,290 So, there's at least-- 871 00:39:01,290 --> 00:39:04,380 we use these heuristics, and, kind of naively, 872 00:39:04,380 --> 00:39:07,410 we don't think about what the flaws in them are, 873 00:39:07,410 --> 00:39:10,800 but people can learn to take that into account 874 00:39:10,800 --> 00:39:12,780 and make better decisions and more accurate 875 00:39:12,780 --> 00:39:17,260 estimates by being aware of how their heuristic systems work. 876 00:39:22,249 --> 00:39:22,749 OK. 877 00:39:42,740 --> 00:39:44,390 Shifting gears a little bit, I want 878 00:39:44,390 --> 00:39:48,480 to talk about one particular study. 879 00:39:48,480 --> 00:39:51,330 So, we were talking a minute ago-- 880 00:39:51,330 --> 00:39:54,090 for the first part of this-- about the strategies 881 00:39:54,090 --> 00:39:56,090 that people use to make decisions in their lives. 882 00:39:58,730 --> 00:40:01,700 These researchers are looking at what some 883 00:40:01,700 --> 00:40:03,980 of the underlying neuroscience is in this. 884 00:40:03,980 --> 00:40:06,400 And they had people do a very simple task. 885 00:40:06,400 --> 00:40:10,160 They said, we're going to have you 886 00:40:10,160 --> 00:40:13,370 hold a button in each hand, and whenever you feel like it, 887 00:40:13,370 --> 00:40:15,320 push one of them. 888 00:40:15,320 --> 00:40:17,290 So, they had people do this. 889 00:40:17,290 --> 00:40:20,330 And they had them sit in an fMRI tube. 890 00:40:20,330 --> 00:40:22,910 And they wanted to know, is there 891 00:40:22,910 --> 00:40:25,550 any way that we can measure which decision people 892 00:40:25,550 --> 00:40:29,330 are going to make before they're aware that they've made it? 893 00:40:29,330 --> 00:40:30,761 So, here you are. 894 00:40:30,761 --> 00:40:32,510 You're sitting in the fMRI tube and you're 895 00:40:32,510 --> 00:40:37,310 seeing a screen which is showing a stream of letters, 896 00:40:37,310 --> 00:40:40,337 each for 500 milliseconds-- half a second. 897 00:40:40,337 --> 00:40:42,170 So, it'd show a letter, show another letter, 898 00:40:42,170 --> 00:40:44,450 show another letter, show another letter. 899 00:40:44,450 --> 00:40:45,880 And you're just hanging out there.
900 00:40:45,880 --> 00:40:47,590 And whenever you feel like it, you 901 00:40:47,590 --> 00:40:49,370 are asked to do two things-- 902 00:40:49,370 --> 00:40:54,020 to press either the left or the right button, 903 00:40:54,020 --> 00:40:56,720 and also to make a note of which letter 904 00:40:56,720 --> 00:41:00,290 was being shown at the moment that you made the decision. 905 00:41:00,290 --> 00:41:03,140 So, you were asked to decide and press right away-- 906 00:41:03,140 --> 00:41:06,230 not to decide and then wait for a minute-- just make a decision, 907 00:41:06,230 --> 00:41:10,440 push the button, and notice which letter was being shown. 908 00:41:10,440 --> 00:41:13,610 So, what they did is, after you pushed the button, 909 00:41:13,610 --> 00:41:17,120 you got a screen and you were asked 910 00:41:17,120 --> 00:41:20,240 to indicate which letter had been 911 00:41:20,240 --> 00:41:23,700 displayed when you had made the decision to press the button. 912 00:41:23,700 --> 00:41:29,600 So, say I'm sitting here watching the screen, 913 00:41:29,600 --> 00:41:34,430 and, in this case, the moment when the Q is shown 914 00:41:34,430 --> 00:41:37,190 is when I decide to press the button. 915 00:41:37,190 --> 00:41:40,040 And no matter how fast I go from decision to button press-- 916 00:41:40,040 --> 00:41:41,660 it might be a little bit slow-- 917 00:41:41,660 --> 00:41:43,940 what's actually being displayed 918 00:41:43,940 --> 00:41:48,430 when the computer detects the button press might be the V. 919 00:41:48,430 --> 00:41:52,410 And after I press the button, the computer's going to ask me, 920 00:41:52,410 --> 00:41:57,354 of these three previous slides, which one was showing when you 921 00:41:57,354 --> 00:41:58,520 decided to press the button? 922 00:41:58,520 --> 00:42:01,100 And if the Q was showing when I decided to press the button, 923 00:42:01,100 --> 00:42:02,400 I'd indicate that. 924 00:42:02,400 --> 00:42:05,882 I'd say it was the Q and move on and do it again. 925 00:42:05,882 --> 00:42:08,090 So, does what they're asking subjects to do make sense? 926 00:42:10,801 --> 00:42:13,050 Decide to push a button, remember which letter it was, 927 00:42:13,050 --> 00:42:14,877 tell us which letter it was. 928 00:42:14,877 --> 00:42:16,460 And what these researchers wanted to find out 929 00:42:16,460 --> 00:42:17,270 was a couple of things. 930 00:42:17,270 --> 00:42:19,436 There's been research in the literature for a while, 931 00:42:19,436 --> 00:42:22,490 for about 20 years now, showing that when 932 00:42:22,490 --> 00:42:24,620 people decide to make a motor movement, 933 00:42:24,620 --> 00:42:26,010 like pressing a button-- 934 00:42:26,010 --> 00:42:30,170 which is a motor response, a very small one compared 935 00:42:30,170 --> 00:42:32,940 to a lot of other motor things-- 936 00:42:32,940 --> 00:42:36,886 then at least some period in advance of that, 937 00:42:36,886 --> 00:42:38,510 maybe half a second in advance of that, 938 00:42:38,510 --> 00:42:41,650 there's activity in the brain that 939 00:42:41,650 --> 00:42:43,400 seems to indicate that a decision has been 940 00:42:43,400 --> 00:42:45,852 made before the participant is aware 941 00:42:45,852 --> 00:42:47,060 that they've made a decision. 942 00:42:47,060 --> 00:42:50,510 Before you say, I've decided-- before you can think that-- 943 00:42:50,510 --> 00:42:54,650 your brain apparently already knows that you've decided. 944 00:42:54,650 --> 00:42:55,900 Creepy, yeah.
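To keep the trial structure straight, here is a minimal, runnable sketch of its logic-- a toy simulation, not the researchers' actual experiment code; the one-letter motor lag is an invented stand-in for the delay between deciding and pressing:

    # Toy simulation of one trial: letters advance every 500 ms, the subject
    # decides at some letter, and the press registers one letter later, so
    # the reported "letter at decision" lags behind what the computer sees.
    import random
    import string

    LETTERS = list(string.ascii_uppercase)

    def simulate_trial() -> dict:
        decision_idx = random.randrange(2, len(LETTERS) - 1)  # when the subject decides
        press_idx = decision_idx + 1  # invented motor lag: press lands one letter later
        return {
            "choice": random.choice(["left", "right"]),
            "letter_at_decision": LETTERS[decision_idx],  # what the subject reports
            "letter_at_press": LETTERS[press_idx],        # what the computer records
            "options_shown": LETTERS[press_idx - 2 : press_idx + 1],  # last three letters
        }

    print(simulate_trial())

The letter the subject reports is what timestamps the conscious decision; the comparison in the study is between that self-reported moment and the brain activity recorded before it.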
945 00:42:55,900 --> 00:42:59,570 So, these guys said, OK, the previous rounds of this 946 00:42:59,570 --> 00:43:03,170 had only asked people to push a single button. 947 00:43:03,170 --> 00:43:04,910 They didn't have a choice. 948 00:43:04,910 --> 00:43:07,640 So, we want to see if we can predict the choice that people 949 00:43:07,640 --> 00:43:07,810 make. 950 00:43:07,810 --> 00:43:09,018 Here they've got two options. 951 00:43:09,018 --> 00:43:11,060 You can push either the button in your left hand 952 00:43:11,060 --> 00:43:12,714 or the button in your right hand. 953 00:43:12,714 --> 00:43:14,630 And they want to see if they can predict which 954 00:43:14,630 --> 00:43:17,960 choice people make before the people are 955 00:43:17,960 --> 00:43:20,700 aware that they've made it. 956 00:43:20,700 --> 00:43:23,900 So, what did they find? 957 00:43:23,900 --> 00:43:25,070 Look, pretty brain pictures. 958 00:43:25,070 --> 00:43:27,010 OK. 959 00:43:27,010 --> 00:43:29,870 So, they had people do this in an fMRI scanner. 960 00:43:29,870 --> 00:43:31,980 They looked at their patterns of brain activity 961 00:43:31,980 --> 00:43:35,570 in particular regions over time. 962 00:43:35,570 --> 00:43:37,200 So, each of these-- as you can see, 963 00:43:37,200 --> 00:43:38,950 there's different particular brain regions 964 00:43:38,950 --> 00:43:41,360 that are being graphed here. 965 00:43:41,360 --> 00:43:45,410 Each of these graphs has time along the bottom axis, 966 00:43:45,410 --> 00:43:47,060 in seconds. 967 00:43:47,060 --> 00:43:49,310 This red line is the point at which 968 00:43:49,310 --> 00:43:52,820 subjects reported that they had made 969 00:43:52,820 --> 00:43:55,880 their decision, using that-- 970 00:43:55,880 --> 00:43:58,170 which letter was showing when you decided-- 971 00:43:58,170 --> 00:43:59,130 that method. 972 00:43:59,130 --> 00:44:05,240 And the vertical axis 973 00:44:05,240 --> 00:44:05,770 is showing prediction accuracy. 974 00:44:05,770 --> 00:44:07,520 So, they looked at the particular patterns 975 00:44:07,520 --> 00:44:09,560 of activation in each of these regions. 976 00:44:09,560 --> 00:44:14,390 And they said, can we find a pattern here that predicts-- 977 00:44:14,390 --> 00:44:16,490 using the information in this brain region, 978 00:44:16,490 --> 00:44:20,180 at this point in time, can we predict which choice you made? 979 00:44:20,180 --> 00:44:22,670 So, this is the 50% line here. 980 00:44:22,670 --> 00:44:24,580 So, that's chance. 981 00:44:24,580 --> 00:44:26,330 And then anything up above that-- so, this 982 00:44:26,330 --> 00:44:28,040 would be like 75% accuracy. 983 00:44:28,040 --> 00:44:31,010 That's pretty good. 984 00:44:31,010 --> 00:44:32,960 So, the filled-in black dots are places 985 00:44:32,960 --> 00:44:35,310 where they're significantly better than chance 986 00:44:35,310 --> 00:44:37,847 at determining which choice was made. 987 00:44:37,847 --> 00:44:39,680 The white ones are places where they're not, 988 00:44:39,680 --> 00:44:42,700 where it's just the same as chance. 989 00:44:42,700 --> 00:44:44,100 So, what did they find out?
990 00:44:44,100 --> 00:44:47,590 Well, they found out that after subjects 991 00:44:47,590 --> 00:44:50,620 made this conscious decision to push the button, then 992 00:44:50,620 --> 00:44:54,315 all of these motor cortex areas are really good at predicting 993 00:44:54,315 --> 00:44:56,440 whether you're going to press it with the left hand 994 00:44:56,440 --> 00:44:58,580 or the right hand, which isn't too surprising. 995 00:44:58,580 --> 00:45:01,604 Motor cortex is involved in all of these kinds of decisions. 996 00:45:04,330 --> 00:45:08,080 Left motor cortex, right motor cortex, the supplementary motor 997 00:45:08,080 --> 00:45:10,030 area and the pre-supplementary motor area-- 998 00:45:10,030 --> 00:45:13,030 which, again, as you guys might remember from last week, 999 00:45:13,030 --> 00:45:16,270 are areas that seem to deal with planning 1000 00:45:16,270 --> 00:45:20,320 activities and goal-directed motion and stuff. 1001 00:45:20,320 --> 00:45:21,790 And so, all of these are areas that 1002 00:45:21,790 --> 00:45:25,240 can identify what choice you're making after you've consciously 1003 00:45:25,240 --> 00:45:26,350 made it. 1004 00:45:26,350 --> 00:45:29,650 What's more interesting is these ones, 1005 00:45:29,650 --> 00:45:32,320 which show some amount of ability 1006 00:45:32,320 --> 00:45:35,200 to predict what choice you make before you're consciously 1007 00:45:35,200 --> 00:45:37,270 aware that you made it. 1008 00:45:37,270 --> 00:45:40,200 So, even way out here-- look at this. 1009 00:45:40,200 --> 00:45:43,840 This is frontopolar cortex, lateral and medial. 1010 00:45:43,840 --> 00:45:46,990 So, way out here. 1011 00:45:46,990 --> 00:45:49,840 Eight seconds before participants 1012 00:45:49,840 --> 00:45:54,370 report that they've made a decision, just looking 1013 00:45:54,370 --> 00:45:56,830 at activity in the frontopolar cortex, 1014 00:45:56,830 --> 00:46:00,302 these researchers could predict with 60% accuracy 1015 00:46:00,302 --> 00:46:02,260 whether you were going to push the right button 1016 00:46:02,260 --> 00:46:03,490 or the left button. 1017 00:46:03,490 --> 00:46:06,610 Eight seconds-- that's like forever in neuroscience terms-- 1018 00:46:06,610 --> 00:46:10,270 before you know that you made that decision, 1019 00:46:10,270 --> 00:46:13,900 I can tell you which decision you're going to make. 1020 00:46:13,900 --> 00:46:15,742 AUDIENCE: That's creepy. 1021 00:46:15,742 --> 00:46:16,450 ABBY NOYCE: Yeah. 1022 00:46:19,330 --> 00:46:23,791 The prediction comes slightly sooner in posterior cingulate cortex. 1023 00:46:23,791 --> 00:46:28,540 It's not quite as good in medial frontopolar. 1024 00:46:28,540 --> 00:46:31,090 But they said if they combined the information from all three 1025 00:46:31,090 --> 00:46:33,470 of these areas, then their accuracy actually went up. 1026 00:46:33,470 --> 00:46:36,730 It went up above that 60% to like 75%-- 1027 00:46:36,730 --> 00:46:38,920 3/4 accuracy at predicting whether subjects 1028 00:46:38,920 --> 00:46:43,000 were going to push the left or the right button 1029 00:46:43,000 --> 00:46:48,006 before they thought they'd made that decision-- which is cool. 1030 00:46:48,006 --> 00:46:50,754 AUDIENCE: Well, couldn't they have a 50% chance of 1031 00:46:50,754 --> 00:46:52,050 [INAUDIBLE]? 1032 00:46:52,050 --> 00:46:54,600 ABBY NOYCE: Right, so it's getting things 1033 00:46:54,600 --> 00:46:56,040 above 50% that are relevant.
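What "significantly above 50%" means can be made concrete with a simple binomial check. This sketch is only an illustration of the idea-- the paper's own statistics were presumably computed differently, and the 100-trial count here is invented:

    # How surprising is k correct guesses out of n if the decoder were
    # really just flipping a coin (50% chance accuracy)?
    from math import comb

    def p_above_chance(k: int, n: int, p: float = 0.5) -> float:
        """One-sided binomial p-value: probability of k or more correct by luck."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    n = 100  # invented trial count, for illustration
    for accuracy in (0.55, 0.60, 0.75):
        k = int(accuracy * n)
        print(f"{accuracy:.0%} correct on {n} trials: p = {p_above_chance(k, n):.3g}")

On 100 trials, 55% correct could easily be luck (p around 0.18), 60% is marginal (p around 0.03), and 75% is essentially impossible by chance alone-- which is why the significance marking matters, not just the raw percentage.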
1034 00:46:58,680 --> 00:47:03,900 So, again, the filled-in circles on the graphs are time points 1035 00:47:03,900 --> 00:47:07,950 at which the difference between their accuracy and chance is 1036 00:47:07,950 --> 00:47:10,664 statistically significant, which-- 1037 00:47:10,664 --> 00:47:12,330 it's not just random that they got this; 1038 00:47:12,330 --> 00:47:13,950 it actually means something. 1039 00:47:13,950 --> 00:47:18,300 Whereas the white ones, where it's very close to chance, 1040 00:47:18,300 --> 00:47:19,692 are not significant. 1041 00:47:27,080 --> 00:47:32,706 Questions, comments, creepy? 1042 00:47:32,706 --> 00:47:35,081 AUDIENCE: I think there was an article in, like, The Wall 1043 00:47:35,081 --> 00:47:36,330 Street Journal about that. 1044 00:47:36,330 --> 00:47:37,913 ABBY NOYCE: Yeah, there have been results 1045 00:47:37,913 --> 00:47:40,170 like this for quite a while. 1046 00:47:40,170 --> 00:47:43,574 It's kind of an ongoing topic of research. 1047 00:47:43,574 --> 00:47:45,240 Like I said, the first people doing this 1048 00:47:45,240 --> 00:47:47,800 were doing it in the '80s, in the mid-'80s. 1049 00:47:47,800 --> 00:47:50,460 But as the methods get better and as new techniques 1050 00:47:50,460 --> 00:47:51,132 get looked at-- 1051 00:47:51,132 --> 00:47:53,340 so I think this is the first one like this where they 1052 00:47:53,340 --> 00:47:56,240 have the two-choice option. 1053 00:47:56,240 --> 00:47:58,710 This is a paper from like May of this year, I want to say. 1054 00:47:58,710 --> 00:47:59,370 This is new. 1055 00:47:59,370 --> 00:48:01,082 This is shiny new. 1056 00:48:01,082 --> 00:48:04,554 AUDIENCE: What happens if you visualize 1057 00:48:04,554 --> 00:48:08,040 pressing with your left hand as you do it with your right? 1058 00:48:08,040 --> 00:48:09,844 ABBY NOYCE: I have no idea. 1059 00:48:09,844 --> 00:48:11,260 You'd probably mess up their data. 1060 00:48:14,820 --> 00:48:17,549 Only do this to researchers with whom you are not friends, 1061 00:48:17,549 --> 00:48:19,840 at least without telling them that's what you're doing. 1062 00:48:22,720 --> 00:48:32,110 AUDIENCE: That's a good test to do [INAUDIBLE] 1063 00:48:32,110 --> 00:48:33,540 ABBY NOYCE: Maybe. 1064 00:48:33,540 --> 00:48:34,860 Yeah. 1065 00:48:34,860 --> 00:48:36,630 And on the other hand, we know that when 1066 00:48:36,630 --> 00:48:39,270 you visualize doing something, at least to some extent, 1067 00:48:39,270 --> 00:48:41,436 the pieces of cortex that would actually be involved 1068 00:48:41,436 --> 00:48:43,480 in doing it are active. 1069 00:48:43,480 --> 00:48:45,732 So, if I ask you to visualize a banana, the same parts 1070 00:48:45,732 --> 00:48:47,190 of your cortex that would be active 1071 00:48:47,190 --> 00:48:49,920 if you were actually looking at a banana are active. 1072 00:48:49,920 --> 00:48:52,732 Or if I ask you to imagine standing up and doing 1073 00:48:52,732 --> 00:48:54,690 a jumping jack, then the pieces of motor cortex 1074 00:48:54,690 --> 00:48:56,800 that would be involved in that are active. 1075 00:49:01,940 --> 00:49:03,120 All right. 1076 00:49:03,120 --> 00:49:04,700 So, this brings us around to some 1077 00:49:04,700 --> 00:49:09,850 of the big philosophical questions in this field.
1078 00:49:09,850 --> 00:49:11,900 So, we've been talking pretty much all 1079 00:49:11,900 --> 00:49:14,270 through this course with this underlying 1080 00:49:14,270 --> 00:49:19,327 assumption that our subjective experience of the world-- 1081 00:49:19,327 --> 00:49:21,410 our perceptions, our understandings, our memories, 1082 00:49:21,410 --> 00:49:23,200 all this-- 1083 00:49:23,200 --> 00:49:26,270 can be traced directly to physical stuff 1084 00:49:26,270 --> 00:49:28,700 going on in the brain. 1085 00:49:28,700 --> 00:49:33,020 And we're pretty sure that physical stuff has to be 1086 00:49:33,020 --> 00:49:35,120 caused by other physical stuff. 1087 00:49:35,120 --> 00:49:38,660 You can't have something that is only 1088 00:49:38,660 --> 00:49:41,030 mental, without a physiological basis, that 1089 00:49:41,030 --> 00:49:42,410 can then cause physical stuff. 1090 00:49:42,410 --> 00:49:43,977 So, physical changes in the brain 1091 00:49:43,977 --> 00:49:45,810 have to be caused by other physical changes. 1092 00:49:45,810 --> 00:49:48,200 Some of the stuff where we've looked at the chemistry 1093 00:49:48,200 --> 00:49:50,930 or the cellular-level changes underlying things-- 1094 00:49:50,930 --> 00:49:53,570 that's this idea that physical changes are caused 1095 00:49:53,570 --> 00:49:56,629 by other physical events. 1096 00:49:56,629 --> 00:49:58,670 And this starts bringing us into all of these fun 1097 00:49:58,670 --> 00:50:01,070 questions about free will. 1098 00:50:01,070 --> 00:50:05,090 So, if you decide to do something 1099 00:50:05,090 --> 00:50:07,432 based on the state your brain is in right now, 1100 00:50:07,432 --> 00:50:09,890 and the state your brain is in right now can be traced back 1101 00:50:09,890 --> 00:50:14,390 through every state your brain has ever been in, then 1102 00:50:14,390 --> 00:50:17,180 do you get to make a free decision, 1103 00:50:17,180 --> 00:50:20,720 or are all of your decisions predetermined 1104 00:50:20,720 --> 00:50:22,430 by the experiences you've previously had? 1105 00:50:29,248 --> 00:50:31,683 AUDIENCE: This class has made me stop trusting everything. 1106 00:50:31,683 --> 00:50:34,610 AUDIENCE: That's a logical argument. 1107 00:50:34,610 --> 00:50:36,539 ABBY NOYCE: I'm OK with that as an outcome. 1108 00:50:36,539 --> 00:50:37,580 AUDIENCE: I would say no. 1109 00:50:37,580 --> 00:50:42,060 But I would say both answers could be argued, 1110 00:50:42,060 --> 00:50:49,040 because if you said yes, you could think that it's-- 1111 00:50:49,040 --> 00:50:50,440 I don't know. 1112 00:50:50,440 --> 00:50:52,460 ABBY NOYCE: So here's a question. 1113 00:50:52,460 --> 00:50:57,250 Suppose you are on the jury in a criminal case for a guy who's 1114 00:50:57,250 --> 00:51:00,910 been accused of beating up a bunch of people-- 1115 00:51:00,910 --> 00:51:04,920 like a serial assault, assault on a number of people; 1116 00:51:04,920 --> 00:51:07,030 he has injured people. 1117 00:51:07,030 --> 00:51:12,580 And what his attorney says is that this guy-- 1118 00:51:12,580 --> 00:51:15,490 you get lots of witnesses saying that up until five years ago 1119 00:51:15,490 --> 00:51:17,770 this guy was gentle as a lamb, never hurt 1120 00:51:17,770 --> 00:51:21,130 anybody in his entire life. 1121 00:51:21,130 --> 00:51:24,097 And his attorney says that he has developed 1122 00:51:24,097 --> 00:51:25,930 a brain tumor, and it's the tumor that has caused 1123 00:51:25,930 --> 00:51:30,190 this change in his behavior.
1124 00:51:30,190 --> 00:51:34,080 Is this guy responsible for what he has done? 1125 00:51:34,080 --> 00:51:35,097 Should he be convicted? 1126 00:51:35,097 --> 00:51:35,680 AUDIENCE: Yes. 1127 00:51:35,680 --> 00:51:36,115 AUDIENCE: No. 1128 00:51:36,115 --> 00:51:36,698 AUDIENCE: Yes. 1129 00:51:36,698 --> 00:51:37,247 AUDIENCE: No. 1130 00:51:37,247 --> 00:51:37,830 AUDIENCE: Yes. 1131 00:51:49,582 --> 00:51:51,706 AUDIENCE: --because he got a brain tumor, 1132 00:51:51,706 --> 00:51:54,008 then it's not his fault. He didn't 1133 00:51:54,008 --> 00:51:55,499 choose to get a brain tumor. 1134 00:51:55,499 --> 00:51:57,487 AUDIENCE: Yeah, but he still hurt other people. 1135 00:52:03,645 --> 00:52:04,270 ABBY NOYCE: OK. 1136 00:52:08,860 --> 00:52:12,887 What happens if we say, OK, you have to be treated for this? 1137 00:52:12,887 --> 00:52:15,220 We remove the brain tumor, and this guy goes back to his 1138 00:52:15,220 --> 00:52:20,140 previous exemplary, gentle-as-a-lamb lifestyle. 1139 00:52:20,140 --> 00:52:22,764 We postponed the trial for a year for him 1140 00:52:22,764 --> 00:52:23,680 to get this treatment. 1141 00:52:23,680 --> 00:52:24,530 Then what? 1142 00:52:24,530 --> 00:52:26,030 Are you going to throw this guy in jail? 1143 00:52:26,030 --> 00:52:26,650 He's got kids. 1144 00:52:26,650 --> 00:52:27,527 He's got family. 1145 00:52:27,527 --> 00:52:28,360 He's got a good job. 1146 00:52:28,360 --> 00:52:31,855 He's a pillar of his community. 1147 00:52:31,855 --> 00:52:34,105 But he beat up 10 people and put them in the hospital. 1148 00:52:38,070 --> 00:52:39,890 These aren't easy questions. 1149 00:52:39,890 --> 00:52:42,160 And this is kind of one of the more clear-cut cases. 1150 00:52:42,160 --> 00:52:48,640 But we're also starting to see-- 1151 00:52:48,640 --> 00:52:51,610 and I think we're going to see more of this-- 1152 00:52:51,610 --> 00:52:52,810 for example, OK. 1153 00:52:52,810 --> 00:52:55,070 So, that's an extreme case. 1154 00:52:55,070 --> 00:52:56,230 What about-- 1155 00:52:56,230 --> 00:52:58,014 AUDIENCE: People who plead insanity 1156 00:52:58,014 --> 00:53:01,210 after doing something really stupid? 1157 00:53:01,210 --> 00:53:03,460 ABBY NOYCE: The insanity plea is an interesting thing. 1158 00:53:03,460 --> 00:53:09,190 Legally, it's got to be based on a professional's judgment 1159 00:53:09,190 --> 00:53:11,470 that the person did not know that what they were doing 1160 00:53:11,470 --> 00:53:12,886 was wrong at the time they did it. 1161 00:53:18,436 --> 00:53:19,810 So, the classic case of this 1162 00:53:19,810 --> 00:53:22,300 is, for example, somebody who is severely mentally retarded 1163 00:53:22,300 --> 00:53:26,140 and doesn't know that it's wrong to take 1164 00:53:26,140 --> 00:53:28,300 a candy bar. 1165 00:53:28,300 --> 00:53:30,248 AUDIENCE: But what if somebody's a sociopath, 1166 00:53:30,248 --> 00:53:32,690 who doesn't realize that it's wrong to kill people? 1167 00:53:32,690 --> 00:53:36,725 ABBY NOYCE: So, that's where you don't send this person 1168 00:53:36,725 --> 00:53:37,600 to the death penalty. 1169 00:53:37,600 --> 00:53:39,430 You send them to involuntary commitment. 1170 00:53:39,430 --> 00:53:41,200 You move them-- so, the state has 1171 00:53:41,200 --> 00:53:46,910 the right to-- if somebody is a danger to others, 1172 00:53:46,910 --> 00:53:49,990 the state has the right to get a judge to say, 1173 00:53:49,990 --> 00:53:52,450 this person can be locked up.
1174 00:53:52,450 --> 00:53:55,180 This person can be confined to a treatment center 1175 00:53:55,180 --> 00:53:57,310 until they are no longer such a danger, which 1176 00:53:57,310 --> 00:53:59,922 may be indefinitely. 1177 00:54:08,900 --> 00:54:10,430 So, what about something that's kind 1178 00:54:10,430 --> 00:54:14,240 of like the brain tumor issue? 1179 00:54:14,240 --> 00:54:17,889 So, we know that, for example, in a lot of ways, 1180 00:54:17,889 --> 00:54:19,430 experiences in early childhood really 1181 00:54:19,430 --> 00:54:22,475 strongly influence how your brain develops. 1182 00:54:25,070 --> 00:54:28,070 For example, we know that kids who 1183 00:54:28,070 --> 00:54:33,560 are abused as little kids tend to grow up 1184 00:54:33,560 --> 00:54:36,740 to be more aggressive, to have a hard time forming 1185 00:54:36,740 --> 00:54:39,710 relationships, and to have, in general, 1186 00:54:39,710 --> 00:54:42,620 more issues all around than kids who grew up 1187 00:54:42,620 --> 00:54:46,240 in stable, healthy families. 1188 00:54:46,240 --> 00:54:52,770 So, what happens when you have some young adult who 1189 00:54:52,770 --> 00:54:57,330 does something violent, who hurts somebody, 1190 00:54:57,330 --> 00:55:02,010 and the case that their attorney makes is, he couldn't help it? 1191 00:55:02,010 --> 00:55:04,960 His brain was wired to do this by his past. 1192 00:55:04,960 --> 00:55:07,800 Once people start saying that-- because 1193 00:55:07,800 --> 00:55:11,160 our brains are shaped by the experiences that happen 1194 00:55:11,160 --> 00:55:15,960 to us, and our future behavior is defined 1195 00:55:15,960 --> 00:55:18,870 by those experiences-- how much control 1196 00:55:18,870 --> 00:55:22,980 does one really have over what one does? 1197 00:55:22,980 --> 00:55:24,480 It starts getting kind of iffy. 1198 00:55:24,480 --> 00:55:26,796 And it's kind of dark to think about, in a lot of ways. 1199 00:55:26,796 --> 00:55:28,170 AUDIENCE: But then, wouldn't that 1200 00:55:28,170 --> 00:55:30,794 mean that no one's responsible for anything that they do? 1201 00:55:30,794 --> 00:55:31,460 ABBY NOYCE: Yes. 1202 00:55:31,460 --> 00:55:35,246 AUDIENCE: They can go around killing everybody, knowing-- 1203 00:55:35,246 --> 00:55:37,620 ABBY NOYCE: This is the slippery slope argument here-- 1204 00:55:37,620 --> 00:55:42,000 that it's definitely possible to start out thinking about 1205 00:55:42,000 --> 00:55:44,820 this and come to the conclusion that there is no such thing 1206 00:55:44,820 --> 00:55:47,850 as personal responsibility, that you cannot be held responsible 1207 00:55:47,850 --> 00:55:50,790 for anything you do, because it's all just your brain doing it 1208 00:55:50,790 --> 00:55:53,310 and it's all defined by personal experience, 1209 00:55:53,310 --> 00:55:55,380 previous experience. 1210 00:55:55,380 --> 00:56:02,020 It's all mechanical; there isn't a choice involved. 1211 00:56:02,020 --> 00:56:05,625 AUDIENCE: That's why I think that, legally, they should not 1212 00:56:05,625 --> 00:56:06,632 take that into account-- 1213 00:56:06,632 --> 00:56:08,840 they should assume that everyone does have free will. 1214 00:56:08,840 --> 00:56:12,330 Because otherwise, it wouldn't work. 1215 00:56:12,330 --> 00:56:15,697 Nothing would actually work. 1216 00:56:15,697 --> 00:56:17,530 ABBY NOYCE: Yes, this is an excellent point.
1217 00:56:17,530 --> 00:56:18,900 That your assumption of free will-- not 1218 00:56:18,900 --> 00:56:20,654 just for the legal system, but for, like, 1219 00:56:20,654 --> 00:56:22,320 getting out of bed in the morning, dude. 1220 00:56:22,320 --> 00:56:24,195 You've got to believe that you have free will 1221 00:56:24,195 --> 00:56:27,080 or it all falls apart. 1222 00:56:27,080 --> 00:56:30,020 I don't know. 1223 00:56:30,020 --> 00:56:31,490 AUDIENCE: --slave to your brain. 1224 00:56:31,490 --> 00:56:32,480 AUDIENCE: --robots. 1225 00:56:32,480 --> 00:56:32,810 ABBY NOYCE: So, what's the difference 1226 00:56:32,810 --> 00:56:34,638 between you and your brain? 1227 00:56:34,638 --> 00:56:36,195 AUDIENCE: I don't know. 1228 00:56:36,195 --> 00:56:37,570 ABBY NOYCE: Is there a difference 1229 00:56:37,570 --> 00:56:38,694 between you and your brain? 1230 00:56:38,694 --> 00:56:40,194 AUDIENCE: It's kind of squishy. 1231 00:56:40,194 --> 00:56:41,610 ABBY NOYCE: It's a good thing it's 1232 00:56:41,610 --> 00:56:43,510 inside that nice solid skull. 1233 00:56:43,510 --> 00:56:46,330 Is there a difference? 1234 00:56:46,330 --> 00:56:48,340 Part of the big question that comes down here-- 1235 00:56:48,340 --> 00:56:53,230 is there a difference between your self and your brain? 1236 00:56:53,230 --> 00:56:57,130 Is that a distinction that we can make validly?