The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high-quality educational resources for free. To make a donation or to view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

ANDREW LO: Actually, at the end of class last time, we were talking about what happened with the stock prices of those four companies that were engaged in manufacturing parts for the space shuttle. And I didn't realize this, but apparently, in some of your other classes, I've been told that you actually did a case on Morton Thiokol, and the fact that there were some interesting discussions between the Morton Thiokol and NASA engineers about whether to launch. And that, in fact, when you just look at it from the perspective of the data, it would not have made sense to go ahead with the launch.

Well, the point about efficient markets is that, while it may not be immediately apparent what all of the inputs are for coming up with a bottom-line price for an asset, the fact is that enormous, enormous amounts of information are collected, collated, analyzed, and ultimately reflected in the price. After a six-month investigation, it was determined that Morton Thiokol was at fault for the explosion. But, in fact, within hours after the crash, literally within minutes, we see that the Morton Thiokol stock price was the one hit hardest. And even when you control for other factors like size and industry effects, you still have a pretty significant finger pointed at Morton Thiokol purely from the stock market.

In a nutshell, this is what efficient markets are about. And for the longest time, that is really what academic finance taught, and what was the basis for virtually everything that we do. So, in fact, all the material that we've covered in this course really assumes and requires that markets work well, that markets reflect all available information.
Because obviously, if they didn't, then when I tell you to go look at the market, I'm telling you to waste your time, right? Because why would you consult something that really didn't contain some significant informational advantages?

That's the good news. That's the positive side of efficient markets. Over the last 15 or 20 years, there has been more and more criticism leveled against this hypothesis of efficient markets. And the most obvious and easiest way to make that point is with a joke that you might have heard about economists. Two economists are walking down the street. One of them sees a $100 bill lying on the ground and starts to pick it up when the other economist says, "Oh, don't bother picking it up." And the first economist says, "Well, why not?" The second economist says, "Well, if that were a real, genuine $100 bill, someone would have already taken it, so it must be a counterfeit."

Now you laugh at that. But that's exactly what the efficient markets hypothesis says. It says that you can't make any money in markets because, if you could, someone would have already done it. Markets are so competitive that, in fact, you can't make any money. And after a while, people started asking questions about this hypothesis. In particular, they would ask, well, if nobody can make any money, then why does anybody bother going into financial markets? If nobody makes any money, then who are the folks out there gathering information to make markets efficient? So, first of all, there's a bit of a Zen paradox, and we'll talk about that Zen paradox a little bit later on.

But the more substantial criticisms have to do with the fact that this presumption of efficiency requires rationality. In particular, it requires that market participants behave the way that economists would argue that they behave. So all of you were very gracious in not arguing with me that you would all want to hold the tangency portfolio.
Now, you could have asked me at that time, well, why would I want to hold the tangency portfolio? I may happen to enjoy holding a small subset of stocks that I know, and love, and have a particular emotional attachment to. But you probably recognize that that wasn't where the class was going, so you were very gracious in not raising that as an issue, even though, in your heart of hearts, you might really prefer that. You might simply want to hold a few stocks that you know, and enjoy, and believe in, and so on and so forth. Well, that's an example of an irrationality from the point of view of the economist. And in a minute, I'm going to actually call into question what we mean by rationality altogether.

So by the end of this lecture, I'm going to make a fairly strong prediction. I'm going to predict that what I'm going to tell you in the next hour and 15 minutes will change your lives permanently. That's a tall order. But you'll let me know. In five or 10 years, I want you to write me back and tell me whether or not I'm right. I think I am right, because I'm going to change the way all of you look at rationality completely. And actually, we're going to do this in the context of your own behavior. All right?

So let me tell you where we're going with this. Behavioral finance, the alternative to market efficiency, says that market participants are simply irrational. They don't make decisions the way that economists would presuppose that they do. In particular, they suffer from all sorts of biases, and they're listed here: loss aversion, anchoring, framing, overconfidence, overreaction, herding, mental accounting, and so on. And some of you, I think, may have come across these in some of the other courses that you've taken. If you ever take an experimental psychology course, you would actually see these kinds of examples in pretty intimate detail.
When you look down this list, and you take seriously what the psychologists are telling us, you would conclude that the human animal is among the stupidest creatures on earth, because we engage in all of these kinds of biases. And I want to go through a few examples. Some of you may have already seen these, because I actually gave these examples in my introductory talk on finance. If you have, we're going to go through them pretty quickly. But if you haven't, I think you'll be kind of surprised by some of the outcomes.

So the loss aversion example is one that I did give in my introductory lecture. How many people have seen this or remember this? OK, so I'm going to go through this pretty quickly. This is an example where, when people are confronted with a choice between a certain gain and a riskier alternative that might have even bigger gains but might also entail gaining nothing, most everybody picks choice A over B, as most of you did when we ran this example at the beginning of the semester. So even though B has a higher expected value, more risk, more expected return, like the CAPM says you should have, nevertheless, most of you as a personal preference would pick the less risky alternative, A.

Now, when you're confronted with losses, as you recall from the example, most of you picked not the sure loss but rather the gamble that might actually cause you to lose even more but might also allow you to get back to even. So the reason that these two choices are so interesting is that, in the first case of A versus B, you picked the less risky alternative because you were being confronted with gains. But in this case, when you're being confronted with two losses, instead of being consistent and picking the less risky alternative, which, by the way, is the sure loss C, virtually all of you picked the gamble. You're willing to take more risk when it comes to losses than you are when it comes to gains.
That's an asymmetry that frankly doesn't jibe with modern financial theory, with utility maximization, with efficient markets. So the reason that these two sets of choices are so interesting when juxtaposed is that the choices that most of you did pick, A and D, were actually strictly dominated by the choices that most of you did not pick, B and C. When you combine them, you can show that B and C together are exactly equal to the most popular choices, A and D, plus $10,000 in cash. So you guys left $10,000 lying on the sidewalk. And that's clearly not efficient.
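To make the arithmetic concrete, here is a minimal Python sketch. The dollar figures are assumed for illustration, in the spirit of the classic Kahneman-Tversky gambles; they are not taken from the lecture slide, though they do reproduce the $10,000 gap described above.

```python
from itertools import product

# Hypothetical lotteries in the spirit of the lecture's example (figures assumed,
# not taken from the slide). Each lottery is a list of (probability, payoff) pairs.
A = [(1.00, 240_000)]                    # sure gain
B = [(0.25, 1_000_000), (0.75, 0)]       # risky gain, higher expected value
C = [(1.00, -750_000)]                   # sure loss
D = [(0.75, -1_000_000), (0.25, 0)]      # risky loss, same expected value as C

def combine(l1, l2):
    """Distribution of the sum of two independent lotteries."""
    return [(p1 * p2, x1 + x2) for (p1, x1), (p2, x2) in product(l1, l2)]

print(sorted(combine(A, D)))  # [(0.25, 240000), (0.75, -760000)]
print(sorted(combine(B, C)))  # [(0.25, 250000), (0.75, -750000)]
# In every state of the world, B + C pays exactly $10,000 more than A + D:
# the popular pair (A, D) is strictly dominated.
```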
Well, it turns out that there is a reason for this kind of bias, and it's hardwired into all of you. I'm going to tell you what that is in a few minutes, after I go through just a couple of other examples to make sure that we all understand the limitations of what we consider to be rational economic decision-making. Any questions about this example?

This is called loss aversion. This is an example that was first proposed by two psychologists, Kahneman and Tversky, and for which Kahneman won the Nobel Prize in economics a couple of years ago. Tversky would have won it as well, had he been alive, but he died a few years before. And what's interesting about the fact that Kahneman won the Nobel Prize is that he's not an economist, he's a psychologist. And so, in a way, maybe the Nobel Memorial Prize in Economics was trying to signal to the economics profession that they ought to take these kinds of things more seriously.

OK. This is the second example that I gave in that introductory lecture, on drawing balls out of an urn. How many people still remember this example? No? OK, so let me go through this in a little bit more detail, because it's directly relevant to the current financial crisis that we're facing. Those of you who've ever taken an undergraduate statistics course will remember the bane of your existence during that semester: the prototypical urn example, right? Drawing balls out of an urn. I hated those examples. They're a real pain in the neck. So I'm going to give you one now. But this one is not as bad as the others that you might have come across.

I want you to imagine an urn where I've got 100 balls. 50 of those balls are colored red, the other 50 are colored black. They're otherwise identical. And you and I are going to play a game. And here's the game. You pick a color, red or black. Don't tell me what it is. Write it down on a piece of paper. And then I'm going to draw a ball out of this urn. And if the ball that I draw is your color, I'm going to pay you $10,000. And if it's not, I'm going to pay you nothing. OK? That's the game. And we're going to do this exactly once. Play this game once. I wish we could do this in reality, but the prize is a little bit higher than what our dean is willing to let me gamble with. But let's imagine we're going to do this for real. OK?

So I want you to tell me, first of all, how many people would be willing to play this game with me for $1? In other words, you pay me $1 and we'll play this-- Yeah, OK. Good. How about $1,000? $2,000? $3,000? $4,000? $4,500? $4,999? $5,010? OK. So I still had a few hands up at $4,999. Nobody wants to play this with me for $5,010, presumably because? Louis?

AUDIENCE: So if you asked the question, how many people would sit down at a blackjack table--

ANDREW LO: Yeah.

AUDIENCE: Or a roulette wheel--

ANDREW LO: Yeah.

AUDIENCE: I think people would raise their hand.

ANDREW LO: Mm-hmm.
AUDIENCE: But they won't pay $5,010 to play this game of red and black?

ANDREW LO: Well, that's a different question though, right?

AUDIENCE: This is a roulette wheel.

ANDREW LO: Well, but it's--

AUDIENCE: [INAUDIBLE] house money.

ANDREW LO: --not a roulette wheel though. It's a different game. It's an urn with 50 red balls and 50 black.

[LAUGHTER]

ANDREW LO: I mean, you know, I have to tell you that a roulette wheel happens to be more entertaining. You see the little ball rolling around, and it's a different game. All right, I want to come back to this in a minute. But let's get back to the price. I established the price of $5,000. And, not surprisingly, all of you extremely numerate MIT students figured out that the expected value is $5,000, right? And so that's what you bid.

OK. Now, before I go on, let me ask you one other question. Do you have a preference for which color you'd pick? No preference? You sure? Why not? No preference?

AUDIENCE: It's 50%.

ANDREW LO: It's 50/50, that's right. So there's no reason for you to have a preference one way or the other. However, I should just let you know that in populations where they do this kind of survey, there is a slight preference for the color red. It's kind of interesting. But, you know, all right, fine, 50/50. You understand the game. There's no preference.

OK, now, we're going to play the exact same game, except with urn B. Urn B is identical to urn A in every respect except that, in this case, I'm not going to tell you the proportion. It could be 50/50 red and black, but it could be 40/60, or 70/30, or 85/15, or 100/0. I will promise you that there are exactly 100 balls and there are, at most, two colors, red and/or black, but beyond that, I will not tell you anything.
And now we're going to play the exact same game. OK? So the question is, first of all, what color would you prefer? Any of you have any particular preference here? Why not?

AUDIENCE: Red.

ANDREW LO: Red. You'd prefer red? OK, yeah.

AUDIENCE: Go for black, because you know that most people prefer red.

ANDREW LO: Well, no, I didn't say most people prefer red. I said there was a slight preference for red, and, Ingrid, you seem to fall into that category. And so you'd prefer black instead. All right. Well, we're going to come back to the color later on. But now let me ask you, how much would you pay to play this game with me? Everything else is the same. A $10,000 prize if the ball I draw matches the color you pick. All right?

AUDIENCE: Do we trust you?

ANDREW LO: What's that?

AUDIENCE: Can we trust you?

ANDREW LO: I don't know. Do you?

[LAUGHTER]

ANDREW LO: Well, first of all, what is there to trust?

AUDIENCE: Is there really a hundred?

ANDREW LO: Yes, that I will guarantee. After the fact, we will verify that there are exactly 100 balls and, at most, two colors, red and/or black, and that's it. No other elements will be introduced. OK? So that's perfectly verifiable, and we will verify it after the fact; all prices are contingent upon that verification. But that's the extent to which you need to trust me. If you're asking whether or not I'm the one that's picking the distribution, then, yes, I am. I'm picking the distribution. But I don't communicate with you, and you don't communicate with me ahead of time, other than what I know about you from this class over the last semester. OK? So now, let's figure out how much this will draw in terms of the price.
How many people will be willing to play this game with me once for $1? $1,000? $2,000? $3,000? $4,000? $4,999? $5,010? OK. So it's about the-- You'd pay more? You'd pay more to play this game than the one before? I'm going to come back to that in a minute. That's worth exploring. But before I get back to Davide, Justin, you were willing to pay the same for both. Why?

AUDIENCE: Well, because the proportion doesn't matter, because you pick your color, so it's kind of out of your hands. You know it's either one color or the other color. You know the proportion's already been decided upon, so it should be the same as the previous game.

ANDREW LO: OK, so you think that the odds are the same in this case as before? Even though I'm the one that picks the odds?

AUDIENCE: When you pick it beforehand and, I guess, if it was random--

ANDREW LO: Well, no, no, no. It's exactly what I just told you. I get to pick the odds, so, unless you consider me random--

AUDIENCE: Yeah, but we still get to pick the ball--

ANDREW LO: You get to pick the-- You have to pick the color; I pick the ball.

AUDIENCE: Yeah, I still think it's about 50/50.

ANDREW LO: 50/50. OK. Mike?

AUDIENCE: Do you pick the odds after we tell you our thoughts?

ANDREW LO: No, clearly not. Yeah. No, no. That's a good point. And, no, we won't cheat, in the sense that I will not pick the distribution after you tell me what your color is. OK. Dannon?

AUDIENCE: [INAUDIBLE] I mean, you really don't know the probability, so--

ANDREW LO: Yeah.

AUDIENCE: You really need to have one black ball? Since you're drawing, you know, one more person who has to match the color.
Then you have another 99 that are left. So your probabilities are really high at that point. Unless you're very unlucky, with that very first ball being black, then-- So, but it could go either way.

ANDREW LO: OK. So I'm-- So what's the bottom line? Do you think it's the same as before, or is it different? Would you bid the same amount to play this game with me?

AUDIENCE: I ended up bidding the same amount.

ANDREW LO: You did? OK. Anybody disagree with that? Anybody that would not pay the same? A lot of you didn't pay the same. Yeah, Jen?

AUDIENCE: Well, I would just calculate that the worst outcome here is worse. It's like $100.

ANDREW LO: Yeah, that's right.

AUDIENCE: I would pay $100.

ANDREW LO: That is the worst outcome. I mean, if you pick red, I could have 100 black balls in there, and then you're going to lose, no matter what.

AUDIENCE: But that's also the best. Yeah. Pick black. Because it's-- [INAUDIBLE]

ANDREW LO: That's true, yes. If you look at it the other way, it's the best case if you did the opposite. Hmm. So what do we make of that? Yeah?

AUDIENCE: I think, basically, the laws of standard probability are the same.

ANDREW LO: Yeah.

AUDIENCE: Because, in this case, the probability is you all being able to pick the right ball. And that's 50/50. The right color, right?

ANDREW LO: Is it? Is it 50/50 though? That's the question. Yeah. I'm picking the distribution, you're picking the ball, right? Yeah, Davide?

AUDIENCE: I'm not sure I understand exactly. So--

[LAUGHTER]

ANDREW LO: That would explain it, certainly. Sure. Go ahead.

AUDIENCE: Say the real proportion is 50/50, I would pay you $5,000.
ANDREW LO: Sure.

AUDIENCE: If the real proportion is zero, I wouldn't pay you any more. If the real proportion is one, I wouldn't pay you any more.

ANDREW LO: Yeah, that's right. Sure. If you knew that this was-- if you had an insider's view of what the proportion is, you might pay more.

AUDIENCE: So, in other words, I would pay more than $5,000.

ANDREW LO: OK, but do you think you know something then?

AUDIENCE: No, but I would say that I would pay more than $5,000.

ANDREW LO: Why? Why? Only if you had more information. But do you? Do you think you have more information?

AUDIENCE: Even if it's less than 50/50, then I would pay more than $5,000.

ANDREW LO: But what if it's more than 50/50? What if it's more than, in other words, what if it's 60/40 against you, as opposed to 40/60 with you?

AUDIENCE: Because at 60/40-- I mean, it's like-- as if it were 100/0, then I would be winning. Sure.

ANDREW LO: That depends on your color. It depends on your color. Yeah, David?

AUDIENCE: If there's no expectation that you're biased, you can't make that expectation, and the expectation of how much you could get is the same.

ANDREW LO: OK.

AUDIENCE: So if you're going just on the expectation of what your bias is--

ANDREW LO: Right. Well, so you guys have seen me over the last 12 weeks. You know what my biases are. In this case, I've already told you that I have some information. Right? That people, on average, prefer red a little bit more than black. So actually, you have that piece of information. What would you do with it? Does that tell you anything?

AUDIENCE: It tells us everything.

ANDREW LO: Well, it does tell you something. It tells you that I know that people prefer red slightly more than black. But the problem with that is that you know I know.
And moreover, I know you know I know, and you know I know you know I know. When you take it back to its infinite regress, you know what it comes out to? 50/50. It turns out that, mathematically, the odds for this game are literally identical to the odds of the previous game.

But it's also the case that when you take these two urns and you auction them off to the public, there is a very, very significant discount in what people will pay for urn B. And we saw this in class. A lot fewer of you raised your hands as we were going to $3,000, $4,000, and $4,999. And you know why? When you ask people why they would pay less for this urn than the previous one, they'll tell you very plainly it's because, in this case, "I don't know what the odds are. I knew what the odds were in the previous case, but here I don't know what kind of risks I'm dealing with."

In most dictionaries and thesauruses, risk and uncertainty are considered synonyms. But in fact, we see here that, from a decision-making point of view, when there is uncertainty about your risk, you tend to shy away, because you don't know what you're dealing with. Now, there is one simple mechanism by which all of you in this room could make it impossible for me to somehow strategically pick the worst distribution against you. Can anybody tell me what that algorithm is? Yeah?

AUDIENCE: If you just flipped a coin to decide what color you pick?

ANDREW LO: Exactly. If Brian flips his coin, not my coin, if you take your coin and flip it, and pick red or black based upon the outcome, there's no way I could possibly do anything to disadvantage you.
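Here is a minimal simulation sketch of that coin-flip argument, under the setup just described (the function name and trial count are illustrative). Whatever proportion p of red balls I choose, a player who picks a color by fair coin flip wins with probability 1/2 * p + 1/2 * (1 - p) = 1/2.

```python
import random

def win_probability_by_simulation(n_red, trials=200_000):
    """Simulate the urn game when the player picks a color by fair coin flip.
    n_red is the number of red balls out of 100 (the adversary's choice)."""
    wins = 0
    for _ in range(trials):
        player_color = "red" if random.random() < 0.5 else "black"    # coin flip
        drawn_color = "red" if random.random() < n_red / 100 else "black"
        wins += (player_color == drawn_color)
    return wins / trials

# No matter how the urn is stacked, the randomized player wins about half the time.
for n_red in (0, 15, 50, 85, 100):
    print(n_red, round(win_probability_by_simulation(n_red), 3))
```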
Now, that's a really interesting conclusion, isn't it? Because what that's telling you is that by injecting uncertainty into a system, by adding randomness, you might actually make it more preferable, more fair. And yet when you tell people that, they say, well, that sounds great, but I still don't want this, because I don't want to deal with the uncertainty. And that's hardwired. We as human beings shy away from the unknown.

OK, now, take this example and apply it to a non-financial corporation that has toxic assets as part of its balance sheet, where you have no idea what the market value is, because there is no market, because it's not trading. Because you don't know how to evaluate the security, because it contains too many parts and you can't figure out what the legal, never mind financial, ramifications are for those assets. And you can understand now why people say, oh, wait a minute, I don't want to play. I don't care what it's worth. I don't want to know. I don't want to deal with it, because I can't evaluate the risk. That's what's happening in spades today. [? Shlomi? ?]

AUDIENCE: In the second game, you said that the people preferred to pay less because they don't want to go with uncertainty.

ANDREW LO: The uncertainty about the risk.

AUDIENCE: But uncertainty about the risk is unimportant.

ANDREW LO: That's right.

AUDIENCE: They don't understand the game. This is the reason why they prefer to [INAUDIBLE].

ANDREW LO: Well, let me put it this way. All of you here, I think, are pretty sophisticated. All of you, by the way, understand the mathematics of this game, I believe. Right? There are not that many moving parts. If we were to do an anonymous survey where I forced you to play with me-- so, in other words, I deducted $5,000 from your payroll, so you're playing with me whether you like it or not. You just paid $5,000. But I did let you choose whether you play with urn A or urn B. I bet you the result, even in as sophisticated an audience as this, is that the vast majority would pick urn A, not urn B. You don't have to raise your hands. I won't put you on the spot.
But my guess is that most of you, if not all of you, would end up picking urn A, simply because you can then evaluate the uncertainty, whereas with B, you can't. So you're right that, mathematically, they're the same. But the fact is that we shy away from anything we don't understand, and what that suggests is that there are layers of risk.

In fact, there was an economist many years ago by the name of Frank Knight who was trying to explain how it was that certain people in the economy were so much richer than others. In fact, he was trying to explain the great wealth of the big mega-industrialists. These are folks like Pierpont Morgan, who was worth, at that time, $100 million, which was an extraordinary amount of wealth. The modern-day version today is Bill Gates. The question is, should Bill Gates be worth $40 billion or $50 billion? Does that make sense? I mean, is he really that much more valuable in any way than any of us? Why should somebody like that be worth so much more money?

And Frank Knight struggled with this, because he didn't understand how, from an economic perspective, you could justify it. And he finally came up with the following hypothesis, or proposal, for why these entrepreneurs make so much money. It's because they are taking on a kind of randomness that is not quantifiable. In other words, there are two kinds of randomness in Frank Knight's perspective. One is risk. Risk is quantifiable. For example, mortality rates. I don't know how many of you have looked at life insurance policies. But if you have, as I did when I had my first child, I decided to learn a little bit about life insurance, and read mortality tables, and looked at life expectancies, and so on.
And I found it enormously depressing, because it turns out that, if you tell me 15 facts about your life, with those 15 facts I can actually pinpoint your life expectancy to within plus or minus five years, with an enormous degree of accuracy. And, as far as I'm concerned, that's much more information than I need. I don't need to know when I'm going to die. Think about it. Actually, as economists, we always say the more information you have, the better. Do you really believe that? Do you want to know when you're going to die? If I had that information, would you want me to tell you when you're going to die? The specific day and time? Would you like to know? Think about that.

Well, it turns out with these mortality tables, whether or not you've ever smoked, do you drink, what kind of car do you drive, have you ever had an accident? There are these 15 or 20 questions that, once you answer them, put you into a box. And that box has a life expectancy that ends up being pretty darned accurate.

Now, the point of this is that the life insurance business is not currently a growth business. I don't think anybody here is dying to graduate from Sloan to start a life insurance company. Why is that? It's because there's not that much profit left, given that it's relatively well understood how to forecast mortality rates. And once you can forecast them, you can price the insurance in such a way that you'll be slightly profitable, and that's pretty much all you're going to do.

So for the kinds of randomness that you can define very precisely, like mortality tables, or property and casualty insurance, there's not a lot of profit left anymore. Maybe for the first person who did it, yes, there was. But at that time, there wasn't a lot of data on this. OK. So Knight said that you don't get rewarded with untold sums of wealth for taking on risk.
What you get rewarded for with untold sums of wealth is taking on uncertainty. Uncertainty is what Frank Knight called the risk that nobody can parameterize. So, nanotechnology. Who the heck knows whether or not that's going to pay off? We don't even know what that is, or whether it's really a business. That kind of uncertainty, if you're willing to take it on, then there are those who will make literally billions of dollars.

And so the profits that economists would argue have to be, in equilibrium, zero, right? A zero-profit condition is typically what an economic equilibrium consists of. But zero profit only applies to the risks that can be parameterized, that can be well understood. And the point is that the human tendency is to shy away from the uncertainties of life. We don't like things we don't understand, and we will shrink from them. So if you are wired in a different way, so that you're willing to take those kinds of risks, you're going to get paid huge sums of money for that. Ike?

AUDIENCE: In terms of the zero-sum math game, what about the silence of the cemetery? That Bill Gates is offset by the thousands of entrepreneurs who've failed and are worth nothing? So, on average, they're all mortal.

ANDREW LO: Well, it may be true that, on average, it all works out. But the point is, why do you have such a huge disparity for certain kinds of randomness whereas, for other kinds of randomness, you have a nice bell curve? In other words, why do you have this kind of tail risk going on? The tail risk that you're talking about? The silence of the cemeteries? The fact that, for every one Bill Gates, you've got several million failed entrepreneurs, is because people are taking on the kind of randomness that nobody knows how to quantify. That's the argument. So it's consistent with that perspective. OK?
So that's why VCs and entrepreneurs-- those of you who want to do your start-ups, I wish you great success. And in this classroom, I will bet that at least one of you will be a billionaire in 15 years. And I want to get to know you very well between now and then. But the downside, which all of you should know, is that the rest of the entrepreneurs, the ones that don't make a billion dollars, will make nothing. Zero. You'll lose everything. Now, that's a reasonable trade-off if your particular psychological makeup enjoys that kind of a thrill. But it's not for everybody, and that's why people get paid what they get paid. Right? So there is some logic to this kind of puzzle.

By the way, this particular example was also developed by a psychologist, by the name of Ellsberg. And it's called the Ellsberg paradox, because this is a situation where, despite the fact that the odds are the same, people don't like the uncertainties. And to me, and to a lot of evolutionary biologists and neurophysiologists, this is not surprising. Because we are hardwired to stay away from stuff we don't know, because it's the stuff you don't know that can kill you. And from an evolutionary perspective, it makes a lot of sense. Question?

AUDIENCE: I feel like a big part of what got us into this whole mess is that [INAUDIBLE] came unwired-- People actually were chasing stuff they didn't know and paying more for it.

ANDREW LO: Well, I think it's not so much that they were chasing what they didn't know but rather that they were chasing stuff they thought they knew that, in fact, they didn't. So we're going to come back to that. Let me hold off on that for just 15 minutes. I want to take that head on when we actually look at the applications to finance. Zeke?

AUDIENCE: I think there's also like an [INAUDIBLE] version of this as well.
823 00:33:51,007 --> 00:33:52,090 It's because there's not-- 824 00:33:52,090 --> 00:33:54,530 there's also not doing your homework. 825 00:33:54,530 --> 00:33:56,695 Taking on some uncertainty whereas you 826 00:33:56,695 --> 00:33:57,694 could be taking risk. 827 00:33:57,694 --> 00:33:58,360 ANDREW LO: Sure. 828 00:33:58,360 --> 00:33:59,360 Absolutely. 829 00:33:59,360 --> 00:34:04,220 It's possible that this kind of risk is not systematic risk. 830 00:34:04,220 --> 00:34:05,960 And there's nothing that says that you 831 00:34:05,960 --> 00:34:09,050 ought to get compensated for that, except for the fact that, 832 00:34:09,050 --> 00:34:11,840 if you're talking about very, very, very 833 00:34:11,840 --> 00:34:15,290 unusual kinds of technologies, or businesses 834 00:34:15,290 --> 00:34:19,350 that nobody really understands or is able to take on, 835 00:34:19,350 --> 00:34:21,020 that's where the multibillion dollar 836 00:34:21,020 --> 00:34:23,000 paydays are going to come from. 837 00:34:23,000 --> 00:34:25,460 It's not going to come from starting up a newsstand, 838 00:34:25,460 --> 00:34:28,900 or setting up another restaurant, or selling coffee. 839 00:34:28,900 --> 00:34:29,989 Well, coffee, maybe. 840 00:34:29,989 --> 00:34:33,590 Let me take that back because we have some counter-examples. 841 00:34:33,590 --> 00:34:35,820 But the point is that, if it's commoditized, 842 00:34:35,820 --> 00:34:37,699 if everybody understands how to do it, 843 00:34:37,699 --> 00:34:41,290 then the profits ultimately are going to get bid down to zero. 844 00:34:41,290 --> 00:34:41,840 OK. 845 00:34:41,840 --> 00:34:42,472 Yeah? 846 00:34:42,472 --> 00:34:44,664 AUDIENCE: It's also brought to your attention, 847 00:34:44,664 --> 00:34:47,380 you know, Warren Buffett's averages 848 00:34:47,380 --> 00:34:51,532 of the normal conventions of, you know, [INAUDIBLE].. 849 00:34:51,532 --> 00:34:55,350 Then he says [INAUDIBLE] standard of [INAUDIBLE].. 850 00:34:55,350 --> 00:34:55,850 Right? 851 00:34:55,850 --> 00:34:58,880 So that's, that's a huge, that's a huge uncertainty factor. 852 00:34:58,880 --> 00:35:01,280 But yet he's still [INAUDIBLE]. 853 00:35:01,280 --> 00:35:02,280 ANDREW LO: That's right. 854 00:35:02,280 --> 00:35:04,930 He may have made money in other ways. 855 00:35:04,930 --> 00:35:08,380 Not necessarily by taking risks that nobody else understood, 856 00:35:08,380 --> 00:35:10,510 but rather by being early in taking 857 00:35:10,510 --> 00:35:13,330 advantage of opportunities that other people didn't see. 858 00:35:13,330 --> 00:35:16,910 So there's more than one way to make a lot of money. 859 00:35:16,910 --> 00:35:19,600 And we'll talk a bit about that at the end, when 860 00:35:19,600 --> 00:35:23,440 I talk about how evolutionary pressures ultimately allow 861 00:35:23,440 --> 00:35:25,360 businesses to fail and succeed. 862 00:35:25,360 --> 00:35:28,076 So I'll come back to that. 863 00:35:28,076 --> 00:35:29,950 Any other questions about the example though? 864 00:35:29,950 --> 00:35:30,450 OK. 865 00:35:30,450 --> 00:35:33,880 So we've got two examples of irrationalities that all of us 866 00:35:33,880 --> 00:35:35,500 seem to possess. 867 00:35:35,500 --> 00:35:37,540 And the third example that I give 868 00:35:37,540 --> 00:35:39,080 is one that I think you have seen, 869 00:35:39,080 --> 00:35:41,350 which is this video clip of this gorilla. 870 00:35:41,350 --> 00:35:42,250 Right?
871 00:35:42,250 --> 00:35:45,950 This is a clear example where we are all 872 00:35:45,950 --> 00:35:51,190 hardwired to see certain things that we are looking to see, 873 00:35:51,190 --> 00:35:53,650 and when we're not looking for other things, 874 00:35:53,650 --> 00:35:55,660 we don't see them at all. 875 00:35:55,660 --> 00:35:59,710 And this final example I give because it provides, I think, 876 00:35:59,710 --> 00:36:05,560 incontrovertible evidence that human cognitive abilities 877 00:36:05,560 --> 00:36:07,010 are limited. 878 00:36:07,010 --> 00:36:10,030 We are not infinitely rational in the sense 879 00:36:10,030 --> 00:36:14,680 that we can compute, observe, analyze, forecast, 880 00:36:14,680 --> 00:36:19,330 and remember, an infinite amount of information. 881 00:36:19,330 --> 00:36:24,160 And so, after all of these examples, this is why I said, 882 00:36:24,160 --> 00:36:26,860 you must think that the human animal is the stupidest 883 00:36:26,860 --> 00:36:28,420 creature on earth. 884 00:36:28,420 --> 00:36:31,690 Because we suffer from all of these behavioral biases. 885 00:36:31,690 --> 00:36:34,720 So the behaviorists would conclude by saying, 886 00:36:34,720 --> 00:36:36,370 we rest our case. 887 00:36:36,370 --> 00:36:38,470 People are irrational, therefore how 888 00:36:38,470 --> 00:36:42,130 can you rely on markets for making any kind of decisions? 889 00:36:42,130 --> 00:36:47,410 Well, the efficient markets types have a pretty powerful retort. 890 00:36:47,410 --> 00:36:49,630 And the powerful retort comes in the form 891 00:36:49,630 --> 00:36:51,490 of what's known as the Dutch Book theorem. 892 00:36:51,490 --> 00:36:53,440 This is an example that I also give 893 00:36:53,440 --> 00:36:56,410 to try to illustrate how it could be that, 894 00:36:56,410 --> 00:36:59,140 even though there are crazy people some of the time, 895 00:36:59,140 --> 00:37:01,810 and certain people that are crazy all of the time, 896 00:37:01,810 --> 00:37:04,840 that nevertheless the smart money will win. 897 00:37:04,840 --> 00:37:06,400 So the example goes like this. 898 00:37:06,400 --> 00:37:09,880 You've got an event A that we're going to bet on, OK? 899 00:37:09,880 --> 00:37:13,540 And the event is that the S&P 500 falls by 5% or more 900 00:37:13,540 --> 00:37:15,257 next Monday. 901 00:37:15,257 --> 00:37:17,340 This used to be considered a pretty extreme event. 902 00:37:17,340 --> 00:37:20,700 Now it's business as usual, right? 903 00:37:20,700 --> 00:37:21,930 OK. 904 00:37:21,930 --> 00:37:25,770 So let's suppose that you, as the irrational, behavioral, 905 00:37:25,770 --> 00:37:29,550 psychological creature that behavioral economists would 906 00:37:29,550 --> 00:37:32,700 argue you are, you have the following probability belief. 907 00:37:32,700 --> 00:37:36,270 The probability of A is 1/2 and the probability of NOT A 908 00:37:36,270 --> 00:37:38,890 is 3/4. 909 00:37:38,890 --> 00:37:42,000 Now some of you are chuckling because you would never 910 00:37:42,000 --> 00:37:43,110 have such beliefs, right? 911 00:37:43,110 --> 00:37:45,690 Because the probabilities, they don't add up to 1.
912 00:37:45,690 --> 00:37:48,180 Well, I guarantee each and every one of you 913 00:37:48,180 --> 00:37:50,370 if I got you into a darkly-lit room, 914 00:37:50,370 --> 00:37:53,100 and I asked you a series of questions over time, 915 00:37:53,100 --> 00:37:56,670 you would ultimately end up exhibiting biased probabilities 916 00:37:56,670 --> 00:37:59,430 just like this, because the human cognitive 917 00:37:59,430 --> 00:38:03,360 faculties are not designed to make correct probabilistic 918 00:38:03,360 --> 00:38:03,990 inferences. 919 00:38:03,990 --> 00:38:06,210 This is not natural, that we know 920 00:38:06,210 --> 00:38:09,360 how to make probabilistic judgments. 921 00:38:09,360 --> 00:38:12,360 So let's suppose that this is what you believe. 922 00:38:12,360 --> 00:38:14,460 Now, what's the problem with this irrationality? 923 00:38:14,460 --> 00:38:16,140 Well, the problem is that you would 924 00:38:16,140 --> 00:38:20,990 be willing to take both of the following two bets. 925 00:38:20,990 --> 00:38:26,210 Bet B1 is a bet where if A occurs, I pay you $1. 926 00:38:26,210 --> 00:38:28,880 If A doesn't occur, you pay me $1. 927 00:38:28,880 --> 00:38:29,454 OK? 928 00:38:29,454 --> 00:38:31,370 And the reason you're willing to take this bet 929 00:38:31,370 --> 00:38:36,740 is because the odds of A occurring are 1/2. 930 00:38:36,740 --> 00:38:39,680 1/2 means 50/50, and so you're willing to take 931 00:38:39,680 --> 00:38:43,750 those 50/50 odds with bet B1. 932 00:38:43,750 --> 00:38:45,970 However, because you also believe 933 00:38:45,970 --> 00:38:49,280 that the probability of not A is 3/4, 934 00:38:49,280 --> 00:38:51,250 which is one to three odds, right? 935 00:38:51,250 --> 00:38:52,420 1/4 to 3/4. 936 00:38:52,420 --> 00:38:53,530 One to three odds. 937 00:38:53,530 --> 00:38:58,170 Because of that fact, you're also willing to take bet B2. 938 00:38:58,170 --> 00:39:04,030 Bet B2 is a bet where I pay you $1 if A doesn't occur, 939 00:39:04,030 --> 00:39:09,270 but you pay me $3 if A does occur. 940 00:39:09,270 --> 00:39:11,940 And again, the reason you are willing to do that 941 00:39:11,940 --> 00:39:16,050 is because you believe that the probability of A not occurring 942 00:39:16,050 --> 00:39:18,900 is 3/4. 943 00:39:18,900 --> 00:39:22,790 Now, if that's the case, and you're willing to take bets B1 and B2, 944 00:39:22,790 --> 00:39:24,770 here's what we're going to do. 945 00:39:24,770 --> 00:39:31,900 I'm going to place $50 on B1 and $25 on B2. 946 00:39:31,900 --> 00:39:36,450 That's the wager that I want to engage in with you. 947 00:39:36,450 --> 00:39:39,660 If we do that, then let's see what happens. 948 00:39:39,660 --> 00:39:46,140 If A occurs, I lose my $50 on B1 but I win three to one odds 949 00:39:46,140 --> 00:39:49,890 on my $25 on B2, so I win $75 on B2, 950 00:39:49,890 --> 00:39:55,380 so I make $25 profit if A occurs. 951 00:39:55,380 --> 00:39:58,400 If A doesn't occur, what happens? 952 00:39:58,400 --> 00:40:02,820 Well, if A doesn't occur, I lose my $25 on B2, 953 00:40:02,820 --> 00:40:06,100 but I make $50 on B1. 954 00:40:06,100 --> 00:40:10,560 I make $25 again. 955 00:40:10,560 --> 00:40:13,410 Heads I win, tails you lose. 956 00:40:13,410 --> 00:40:17,580 No matter what happens, I make $25, 957 00:40:17,580 --> 00:40:20,460 so I'm going to like this a lot. 958 00:40:20,460 --> 00:40:24,390 And I'm going to keep doing it, and doing it, and doing it, 959 00:40:24,390 --> 00:40:26,620 until one of two things happens.
960 00:40:26,620 --> 00:40:32,520 Either you run out of money, or you change your probability 961 00:40:32,520 --> 00:40:34,930 beliefs. 962 00:40:34,930 --> 00:40:39,220 And it turns out that there is only one set of probability 963 00:40:39,220 --> 00:40:44,680 beliefs for which I cannot construct this free lunch. 964 00:40:44,680 --> 00:40:46,460 You know what that is? 965 00:40:46,460 --> 00:40:48,830 It's when the probabilities sum to one. 966 00:40:48,830 --> 00:40:52,970 So the point is that there is a limit to how irrational 967 00:40:52,970 --> 00:40:54,620 you can get. 968 00:40:54,620 --> 00:40:59,930 Because as long as I'm there to take advantage of you, 969 00:40:59,930 --> 00:41:03,120 I'm simply going to pump money out of you until you cry 970 00:41:03,120 --> 00:41:03,620 uncle. 971 00:41:03,620 --> 00:41:06,370 Until you change your beliefs. 972 00:41:06,370 --> 00:41:06,870 Yeah? 973 00:41:06,870 --> 00:41:08,734 AUDIENCE: Are there any examples of where this has actually 974 00:41:08,734 --> 00:41:09,200 been exploited? 975 00:41:09,200 --> 00:41:10,116 ANDREW LO: Absolutely. 976 00:41:10,116 --> 00:41:11,490 There are tons of examples. 977 00:41:11,490 --> 00:41:13,530 There are tons of examples of mispricing. 978 00:41:13,530 --> 00:41:15,170 In fact, the example that I gave you 979 00:41:15,170 --> 00:41:16,920 when we were doing fixed income arbitrage, 980 00:41:16,920 --> 00:41:20,100 when we had these different treasury securities 981 00:41:20,100 --> 00:41:24,300 that had different yields for the very, very same maturities, 982 00:41:24,300 --> 00:41:27,450 and you could solve using simultaneous linear equations 983 00:41:27,450 --> 00:41:28,740 for the arbitrage. 984 00:41:28,740 --> 00:41:32,150 That is exactly this example. 985 00:41:32,150 --> 00:41:36,490 So there are many, many examples where this occurs. 986 00:41:36,490 --> 00:41:40,630 And it doesn't last, typically, because people 987 00:41:40,630 --> 00:41:43,900 get tired of losing money, and ultimately either they 988 00:41:43,900 --> 00:41:48,010 run out of that money, or they get wise. 989 00:41:48,010 --> 00:41:51,190 So the idea behind this argument that efficient markets 990 00:41:51,190 --> 00:41:55,150 folks make is that, sure, people may be irrational from time 991 00:41:55,150 --> 00:41:56,950 to time, and there may be some people that 992 00:41:56,950 --> 00:41:59,410 are always irrational, but as long 993 00:41:59,410 --> 00:42:02,750 as there are smart people out there, as long as there's 994 00:42:02,750 --> 00:42:05,920 smart money out there, then it doesn't matter. 995 00:42:05,920 --> 00:42:10,000 Because the probabilities will sum to one through market 996 00:42:10,000 --> 00:42:11,840 forces. 997 00:42:11,840 --> 00:42:16,520 And so then the question is, how strong are those market forces? 998 00:42:16,520 --> 00:42:18,740 And I would point to the current crisis 999 00:42:18,740 --> 00:42:22,100 to tell you that those market forces, 1000 00:42:22,100 --> 00:42:26,690 while they may sound good on paper at a place like MIT, 1001 00:42:26,690 --> 00:42:31,010 are not always enough. I promise you that what John Maynard Keynes said decades ago 1002 00:42:31,010 --> 00:42:32,270 is still true. 1003 00:42:32,270 --> 00:42:36,380 Which is, the market can stay irrational much, much longer 1004 00:42:36,380 --> 00:42:39,200 than you can stay solvent.
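(As a quick numerical sketch of the Dutch book construction above, here is a small Python example that uses only the beliefs and stakes stated in the lecture; the function name and layout are just for illustration, not part of the lecture itself.)

# A sketch of the Dutch book described above. The student's (inconsistent)
# beliefs are P(A) = 1/2 and P(not A) = 3/4, which sum to 1.25 rather than 1,
# so the student is willing to accept both bet B1 and bet B2.

def bookmaker_payoff(a_occurs):
    # I stake $50 on B1 (even money on A) and $25 on B2 (3-to-1 against A).
    stake_b1, stake_b2 = 50, 25
    if a_occurs:
        b1 = -stake_b1       # B1: A occurs, so I pay you $1 per $1 staked
        b2 = 3 * stake_b2    # B2: A occurs, so you pay me $3 per $1 staked
    else:
        b1 = stake_b1        # B1: A fails, so you pay me $1 per $1 staked
        b2 = -stake_b2       # B2: A fails, so I pay you $1 per $1 staked
    return b1 + b2

print(bookmaker_payoff(True))   # 25 -- I make $25 if A occurs
print(bookmaker_payoff(False))  # 25 -- I make $25 if A does not occur

(Running it shows the same $25 profit to the bookmaker in both states; the construction only breaks down when the stated probabilities sum to one.)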
1005 00:42:39,200 --> 00:42:41,930 In other words, if you are the only sane person 1006 00:42:41,930 --> 00:42:45,260 in an irrational world, you are the one with the problem. 1007 00:42:45,260 --> 00:42:48,680 Because you can't take advantage of these opportunities 1008 00:42:48,680 --> 00:42:55,510 before ultimately the forces of irrationality overwhelm you. 1009 00:42:55,510 --> 00:42:58,470 At least that's what John Maynard Keynes thought. 1010 00:42:58,470 --> 00:43:01,090 Now, it seems to suggest then that it's 1011 00:43:01,090 --> 00:43:03,310 a question about magnitudes. 1012 00:43:03,310 --> 00:43:06,510 In other words, you've got smart people here, 1013 00:43:06,510 --> 00:43:08,680 you've got irrational people here-- 1014 00:43:08,680 --> 00:43:15,020 They're going to interact and may the bigger pocketbook win. 1015 00:43:15,020 --> 00:43:17,990 Question is, who's got the bigger pocketbook? 1016 00:43:17,990 --> 00:43:21,350 Who's got the stronger set of forces? 1017 00:43:21,350 --> 00:43:26,140 The irrational guys or the efficient markets guys? 1018 00:43:26,140 --> 00:43:28,150 And before I answer that question, 1019 00:43:28,150 --> 00:43:31,330 or at least propose an answer to that question, 1020 00:43:31,330 --> 00:43:33,850 I need to now beg your indulgence 1021 00:43:33,850 --> 00:43:36,610 for the next 15 or 20 minutes because we're 1022 00:43:36,610 --> 00:43:38,560 going to take a little bit of a detour. 1023 00:43:38,560 --> 00:43:40,720 I want to bring in some evidence from 1024 00:43:40,720 --> 00:43:44,230 the cognitive neurosciences that will speak directly 1025 00:43:44,230 --> 00:43:49,030 to this issue of what we even mean by rationality. 1026 00:43:49,030 --> 00:43:53,710 It turns out that what economists mean by rational 1027 00:43:53,710 --> 00:43:59,590 is not what you and I take to be rational from a daily, 1028 00:43:59,590 --> 00:44:03,060 kind of, activities type of perspective. 1029 00:44:03,060 --> 00:44:05,510 So I want to tell you about some research 1030 00:44:05,510 --> 00:44:08,980 in the cognitive neurosciences that really 1031 00:44:08,980 --> 00:44:12,490 changes the way we think about rationality 1032 00:44:12,490 --> 00:44:14,050 and decision-making. 1033 00:44:14,050 --> 00:44:18,070 To do that, I'm going to tell you about a particular case 1034 00:44:18,070 --> 00:44:20,680 study that was described in a book written 1035 00:44:20,680 --> 00:44:24,070 by a neuroscientist by the name of Antonio Damasio. 1036 00:44:24,070 --> 00:44:27,520 Damasio is a clinical neurologist, 1037 00:44:27,520 --> 00:44:30,880 meaning that he sees patients as well as does research. 1038 00:44:30,880 --> 00:44:35,784 And about 15 years ago, he wrote a book titled Descartes' Error, 1039 00:44:35,784 --> 00:44:37,450 and in a few minutes I'll explain to you 1040 00:44:37,450 --> 00:44:39,100 where that title comes from. 1041 00:44:39,100 --> 00:44:42,160 But in this book, he describes his experience 1042 00:44:42,160 --> 00:44:43,770 in dealing with one of his patients, 1043 00:44:43,770 --> 00:44:46,900 codenamed "Elliot," who developed a brain tumor that 1044 00:44:46,900 --> 00:44:48,250 had to be surgically removed. 
1045 00:44:48,250 --> 00:44:51,430 And, as you may know, with these kinds of surgeries, 1046 00:44:51,430 --> 00:44:54,220 you've got to remove a healthy portion of the brain 1047 00:44:54,220 --> 00:44:55,690 as well as the tumor to make sure 1048 00:44:55,690 --> 00:44:57,065 that you've got all of the tumor, 1049 00:44:57,065 --> 00:45:00,460 and you don't leave anything remaining. 1050 00:45:00,460 --> 00:45:03,080 So after they took out this tumor, 1051 00:45:03,080 --> 00:45:05,830 and after Elliot recovered from the surgery, 1052 00:45:05,830 --> 00:45:09,160 they gave him a battery of tests to see whether or not 1053 00:45:09,160 --> 00:45:11,860 any of his cognitive abilities were impaired 1054 00:45:11,860 --> 00:45:14,680 because of this surgery. 1055 00:45:14,680 --> 00:45:18,610 They gave him a perception test, memory test, IQ test, 1056 00:45:18,610 --> 00:45:21,400 learning ability, language ability, arithmetic. 1057 00:45:21,400 --> 00:45:23,980 With every single test, Elliot passed 1058 00:45:23,980 --> 00:45:28,210 with flying colors, either average or well above-average. 1059 00:45:28,210 --> 00:45:34,240 And yet, within weeks after his recovery from the surgery, 1060 00:45:34,240 --> 00:45:37,870 he was fired from his job, his wife left him, 1061 00:45:37,870 --> 00:45:39,620 and, for all intents and purposes, 1062 00:45:39,620 --> 00:45:41,190 he had to be institutionalized. 1063 00:45:41,190 --> 00:45:44,350 He couldn't live on his own because his behavior became 1064 00:45:44,350 --> 00:45:46,340 so irrational. 1065 00:45:46,340 --> 00:45:50,200 Now let me describe to you what kind of irrationality 1066 00:45:50,200 --> 00:45:51,760 we're talking about. 1067 00:45:51,760 --> 00:45:58,150 Damasio describes what Elliot did when he was 1068 00:45:58,150 --> 00:46:00,590 engaged in a task at his job. 1069 00:46:00,590 --> 00:46:02,500 He was an accountant at the time. 1070 00:46:02,500 --> 00:46:05,290 And so Damasio writes, "When the job called 1071 00:46:05,290 --> 00:46:08,200 for interrupting an activity and turning to another, 1072 00:46:08,200 --> 00:46:10,010 he might persist nonetheless, seemingly 1073 00:46:10,010 --> 00:46:11,260 losing sight of his main goal. 1074 00:46:11,260 --> 00:46:13,030 Or he might interrupt the activity 1075 00:46:13,030 --> 00:46:14,530 he had engaged, to turn to something 1076 00:46:14,530 --> 00:46:16,930 he found more captivating at that particular moment. 1077 00:46:16,930 --> 00:46:19,870 The flow of work was stopped. 1078 00:46:19,870 --> 00:46:22,234 One might say that the particular step 1079 00:46:22,234 --> 00:46:23,650 of the task at which Elliot balked 1080 00:46:23,650 --> 00:46:25,790 was actually being carried out too well, 1081 00:46:25,790 --> 00:46:28,690 and at the expense of the overall purpose. 1082 00:46:28,690 --> 00:46:32,260 One might say that Elliot had become irrational concerning 1083 00:46:32,260 --> 00:46:34,800 the larger frame of behavior." 1084 00:46:34,800 --> 00:46:41,200 Now, what Damasio was telling you is a very simple example 1085 00:46:41,200 --> 00:46:43,210 that he gave in the book, where Elliot 1086 00:46:43,210 --> 00:46:46,320 was asked to write a letter to one of his clients. 1087 00:46:46,320 --> 00:46:49,680 He'd fire up the word processor and, before he 1088 00:46:49,680 --> 00:46:51,990 started typing that letter, he had 1089 00:46:51,990 --> 00:46:54,390 to pick just the right font. 1090 00:46:54,390 --> 00:46:57,550 And it took him three hours to pick that font.
1091 00:46:57,550 --> 00:46:59,980 Three hours, literally. 1092 00:46:59,980 --> 00:47:02,740 Now, those of you who use Microsoft Word, 1093 00:47:02,740 --> 00:47:04,172 I know that when you first got it, 1094 00:47:04,172 --> 00:47:05,380 you did the same thing I did. 1095 00:47:05,380 --> 00:47:07,480 You pulled down that font menu, and you 1096 00:47:07,480 --> 00:47:11,770 went through Arial, Sans Serif, Times New Roman. 1097 00:47:11,770 --> 00:47:14,950 You checked out all those fonts, but it probably 1098 00:47:14,950 --> 00:47:17,360 didn't take you three hours. 1099 00:47:17,360 --> 00:47:21,770 Moreover, it didn't take you three hours every time you 1100 00:47:21,770 --> 00:47:23,790 had to write a letter. 1101 00:47:23,790 --> 00:47:26,250 Well, it did for Elliot after the surgery. 1102 00:47:26,250 --> 00:47:31,020 Every letter he wrote took three hours at the start 1103 00:47:31,020 --> 00:47:35,570 because he had to pick just the right font. 1104 00:47:35,570 --> 00:47:42,050 And it didn't dawn on Damasio what was going on until he gave Elliot one test 1105 00:47:42,050 --> 00:47:46,740 that he finally failed. 1106 00:47:46,740 --> 00:47:48,580 And let me tell you what the test was. 1107 00:47:48,580 --> 00:47:51,520 Psychologists often use something 1108 00:47:51,520 --> 00:47:57,310 called eye-blink response rate to measure emotional reaction. 1109 00:47:57,310 --> 00:47:59,860 It turns out that for normal human beings, when 1110 00:47:59,860 --> 00:48:02,440 you have a strong emotional response, 1111 00:48:02,440 --> 00:48:04,850 you blink your eyes a little bit more quickly than usual. 1112 00:48:04,850 --> 00:48:05,350 All right? 1113 00:48:05,350 --> 00:48:08,020 Those of you who've ever seen the Betty Boop cartoons 1114 00:48:08,020 --> 00:48:10,300 where they have a female character trying 1115 00:48:10,300 --> 00:48:13,270 to flirt with somebody else, the way they'll indicate 1116 00:48:13,270 --> 00:48:16,750 flirtation is blinking her eyes a little bit more quickly 1117 00:48:16,750 --> 00:48:18,130 or batting her eyelashes. 1118 00:48:18,130 --> 00:48:18,820 Right? 1119 00:48:18,820 --> 00:48:21,020 Those of you who are serious poker players, 1120 00:48:21,020 --> 00:48:23,290 you'll know that there's a poker tell, 1121 00:48:23,290 --> 00:48:26,380 meaning a physical characteristic 1122 00:48:26,380 --> 00:48:29,752 that will signal a strong emotional reaction on the part 1123 00:48:29,752 --> 00:48:30,460 of your opponent. 1124 00:48:30,460 --> 00:48:32,590 And that is, when they look at their hand, 1125 00:48:32,590 --> 00:48:34,769 they blink their eyes a little bit more rapidly. 1126 00:48:34,769 --> 00:48:37,060 Either it's a really strong hand or a really weak hand. 1127 00:48:37,060 --> 00:48:39,070 That's, by the way, why many poker players-- 1128 00:48:39,070 --> 00:48:40,900 they wear sunglasses when they play. 1129 00:48:40,900 --> 00:48:45,890 Not to look cool but to hide their reactions. 1130 00:48:45,890 --> 00:48:47,420 What they do is they take a subject 1131 00:48:47,420 --> 00:48:49,211 and put them in front of a computer screen. 1132 00:48:49,211 --> 00:48:50,660 Attach electrodes to their eyelids 1133 00:48:50,660 --> 00:48:52,880 to measure how quickly they're blinking their eyes, 1134 00:48:52,880 --> 00:48:54,650 and they show the subject strong, 1135 00:48:54,650 --> 00:48:56,400 emotionally-charged images.
1136 00:48:56,400 --> 00:48:59,240 Things like pornography or scenes 1137 00:48:59,240 --> 00:49:01,910 from Nazi concentration camps. 1138 00:49:01,910 --> 00:49:03,890 And when we have strong emotional reactions, 1139 00:49:03,890 --> 00:49:07,730 you can measure the increase in the rate 1140 00:49:07,730 --> 00:49:10,440 at which we blink our eyes. 1141 00:49:10,440 --> 00:49:15,530 It turns out that Elliot had no change in his eye-blink 1142 00:49:15,530 --> 00:49:16,400 response rates. 1143 00:49:16,400 --> 00:49:20,310 None whatsoever. 1144 00:49:20,310 --> 00:49:22,200 And so Damasio finally asked Elliot, 1145 00:49:22,200 --> 00:49:26,250 "Elliot, how do you feel?" 1146 00:49:26,250 --> 00:49:28,270 And Elliot said, "You know, it's funny. 1147 00:49:28,270 --> 00:49:32,870 After the surgery, I observed that the things 1148 00:49:32,870 --> 00:49:37,600 that I used to enjoy before the surgery, I don't anymore. 1149 00:49:37,600 --> 00:49:40,810 For example, I used to like steak dinners, red wine, 1150 00:49:40,810 --> 00:49:42,190 and Mozart. 1151 00:49:42,190 --> 00:49:45,930 And I've experienced all of those things afterwards, 1152 00:49:45,930 --> 00:49:47,760 and I don't feel anything. 1153 00:49:47,760 --> 00:49:51,590 I know I should be enjoying them, but I don't. 1154 00:49:51,590 --> 00:49:52,920 I don't feel anything. 1155 00:49:52,920 --> 00:49:54,290 I don't dislike it. 1156 00:49:54,290 --> 00:49:58,605 I just don't really enjoy it as much as I know I should." 1157 00:49:58,605 --> 00:49:59,980 Now, I don't know how many of you 1158 00:49:59,980 --> 00:50:01,810 have had that kind of reaction. 1159 00:50:01,810 --> 00:50:03,550 I have fairly recently. 1160 00:50:03,550 --> 00:50:06,670 My eight-year-old son and I were at the supermarket 1161 00:50:06,670 --> 00:50:09,730 the other day, and I was going by the candy aisle, 1162 00:50:09,730 --> 00:50:12,830 and I noticed that there were Cracker Jacks on the shelf. 1163 00:50:12,830 --> 00:50:14,830 I don't know how many of you know what they are? 1164 00:50:14,830 --> 00:50:18,970 Peanut-coated popcorn candy and a prize, right? 1165 00:50:18,970 --> 00:50:21,490 Well, I loved Cracker Jacks when I was eight years old. 1166 00:50:21,490 --> 00:50:24,370 I remember that very distinctly, just really enjoying it. 1167 00:50:24,370 --> 00:50:27,250 So I decided I was going to get some for my eight-year-old. 1168 00:50:27,250 --> 00:50:30,730 So I got some for him, and I tried some as well. 1169 00:50:30,730 --> 00:50:34,390 And I remember thinking, gee, I know I should really 1170 00:50:34,390 --> 00:50:36,430 be enjoying this, but I'm not. 1171 00:50:36,430 --> 00:50:39,310 It's not-- Maybe my tastes have changed, I don't know, 1172 00:50:39,310 --> 00:50:43,240 but it's not nearly as wonderful as I remembered it to be. 1173 00:50:43,240 --> 00:50:48,370 Imagine having that feeling about everything. 1174 00:50:48,370 --> 00:50:53,770 All your fears, anxieties, your great desires, 1175 00:50:53,770 --> 00:50:57,111 all of your emotions, gone. 1176 00:50:57,111 --> 00:50:58,110 You don't feel anything. 1177 00:50:58,110 --> 00:51:00,980 You sort of feel numb inside. 1178 00:51:00,980 --> 00:51:04,370 It turns out that Damasio conjectured 1179 00:51:04,370 --> 00:51:08,120 that what we think of as rational behavior 1180 00:51:08,120 --> 00:51:11,570 actually requires emotion. 1181 00:51:11,570 --> 00:51:15,260 In other words, you have to be able to feel in order 1182 00:51:15,260 --> 00:51:17,870 to act rationally. 
1183 00:51:17,870 --> 00:51:23,420 Now, that's really very different from the typical kind 1184 00:51:23,420 --> 00:51:26,780 of impression where rationality is at one end of the spectrum 1185 00:51:26,780 --> 00:51:29,414 and emotion, fear, and greed are at the other end 1186 00:51:29,414 --> 00:51:30,080 of the spectrum. 1187 00:51:30,080 --> 00:51:32,820 This is Descartes' perspective on the world. 1188 00:51:32,820 --> 00:51:34,820 This is why the book is titled Descartes' Error. 1189 00:51:34,820 --> 00:51:37,790 He's arguing that Descartes got it all wrong. 1190 00:51:37,790 --> 00:51:42,110 That, in fact, emotion is the other side of the same coin. 1191 00:51:42,110 --> 00:51:46,660 If you can't feel, then you can't be rational. 1192 00:51:46,660 --> 00:51:51,832 So this calls into question what we think of as rationality. 1193 00:51:51,832 --> 00:51:54,040 And I'm going to beg your indulgence for another five 1194 00:51:54,040 --> 00:51:58,120 minutes as I tell you more about the neurophysiological basis 1195 00:51:58,120 --> 00:51:59,530 for this observation. 1196 00:51:59,530 --> 00:52:02,410 It turns out that the part of the brain that they took out 1197 00:52:02,410 --> 00:52:06,010 from Elliot with his tumor is the part that 1198 00:52:06,010 --> 00:52:09,620 was responsible for emotional reaction. 1199 00:52:09,620 --> 00:52:12,910 And there have been a number of theories 1200 00:52:12,910 --> 00:52:15,800 that have been developed now about how the brain works. 1201 00:52:15,800 --> 00:52:19,420 So I want to describe probably the most popular theory, 1202 00:52:19,420 --> 00:52:21,329 at least as of about 10 years ago, 1203 00:52:21,329 --> 00:52:22,870 and you'll get a better understanding 1204 00:52:22,870 --> 00:52:23,703 for what's going on. 1205 00:52:23,703 --> 00:52:27,430 And this is the part that I think will change your lives. 1206 00:52:27,430 --> 00:52:31,580 So it turns out that the brain, which is pictured here 1207 00:52:31,580 --> 00:52:36,370 in cross-section, is not nearly the homogeneous organ 1208 00:52:36,370 --> 00:52:39,754 that we often take it to be. 1209 00:52:39,754 --> 00:52:41,420 It turns out that the brain is comprised 1210 00:52:41,420 --> 00:52:42,740 of different components. 1211 00:52:42,740 --> 00:52:45,950 And a neuroscientist by the name of Paul MacLean 1212 00:52:45,950 --> 00:52:48,200 about 10 years ago came up with what's 1213 00:52:48,200 --> 00:52:51,260 now called the triune model of the brain, which 1214 00:52:51,260 --> 00:52:55,980 decomposes the brain into three separate components. 1215 00:52:55,980 --> 00:52:59,220 The first component is located at the bottom of the brain. 1216 00:52:59,220 --> 00:53:01,550 It's also known as the brain stem. 1217 00:53:01,550 --> 00:53:03,080 It's the part of the brain that sits 1218 00:53:03,080 --> 00:53:05,670 on top of your spinal cord, and all 1219 00:53:05,670 --> 00:53:08,840 of the nerves that go up through your spinal cord 1220 00:53:08,840 --> 00:53:11,940 attach into that brain stem. 1221 00:53:11,940 --> 00:53:15,030 And neuroscientists have determined that the brain stem 1222 00:53:15,030 --> 00:53:18,870 is responsible for all of the, sort of, bodily functions that 1223 00:53:18,870 --> 00:53:21,580 you take for granted, and typically you cannot control, 1224 00:53:21,580 --> 00:53:23,670 unless you're a Zen Sufi master.
1225 00:53:23,670 --> 00:53:26,820 Things like heart rate, breathing, 1226 00:53:26,820 --> 00:53:28,620 body temperature regulation, things that we 1227 00:53:28,620 --> 00:53:30,600 need for basic survival. 1228 00:53:30,600 --> 00:53:33,510 And MacLean called the brain stem the reptilian brain 1229 00:53:33,510 --> 00:53:35,250 because this is the part of the brain 1230 00:53:35,250 --> 00:53:38,550 that we have in common with virtually all vertebrates. 1231 00:53:38,550 --> 00:53:41,280 Reptiles, mammals, anything with a spinal cord 1232 00:53:41,280 --> 00:53:44,580 will have a brain stem sitting on top of that. 1233 00:53:44,580 --> 00:53:46,410 And from an evolutionary perspective, 1234 00:53:46,410 --> 00:53:49,260 it is the oldest part of the brain, 1235 00:53:49,260 --> 00:53:52,290 and the part that really controls the very most 1236 00:53:52,290 --> 00:53:55,990 primitive functions of life. 1237 00:53:55,990 --> 00:53:58,290 The second part of the brain that MacLean identified is 1238 00:53:58,290 --> 00:54:02,040 the middle part, known as the mid-brain, 1239 00:54:02,040 --> 00:54:05,910 or what MacLean called the mammalian brain. 1240 00:54:05,910 --> 00:54:09,660 This is the part of the brain that we share only 1241 00:54:09,660 --> 00:54:12,630 with warm blooded creatures, with mammals. 1242 00:54:12,630 --> 00:54:15,030 So reptiles don't have this. 1243 00:54:15,030 --> 00:54:18,480 And it's evolutionarily somewhat later in development 1244 00:54:18,480 --> 00:54:20,010 than the reptilian brain. 1245 00:54:20,010 --> 00:54:22,290 And this part of the brain is responsible 1246 00:54:22,290 --> 00:54:26,100 for emotional reaction, for social behavior, 1247 00:54:26,100 --> 00:54:30,930 for fear and greed, love, sexual attraction. 1248 00:54:30,930 --> 00:54:33,717 Now, you might ask, well, how do neuroscientists know this? 1249 00:54:33,717 --> 00:54:35,550 Well, the way that they've discovered this-- 1250 00:54:35,550 --> 00:54:37,860 and it's only been over the last couple of decades-- 1251 00:54:37,860 --> 00:54:40,770 is they take a subject, and they put him in an MRI machine. 1252 00:54:40,770 --> 00:54:42,690 All of you I think know what an MRI is, right? 1253 00:54:42,690 --> 00:54:45,060 Magnetic resonance imaging. 1254 00:54:45,060 --> 00:54:48,300 And instead of taking a snapshot, what neuroscientists 1255 00:54:48,300 --> 00:54:51,300 will do is to take a series of snapshots 1256 00:54:51,300 --> 00:54:56,100 over many, many milliseconds and eventually put together 1257 00:54:56,100 --> 00:54:58,120 a movie of your brain. 1258 00:54:58,120 --> 00:54:59,670 Now, the reason that's interesting 1259 00:54:59,670 --> 00:55:05,970 is because blood contains iron, and iron shows up on an MRI. 1260 00:55:05,970 --> 00:55:07,980 So when they take a movie of your brain, 1261 00:55:07,980 --> 00:55:10,890 they see the blood flowing in your brain. 1262 00:55:10,890 --> 00:55:12,960 And while a subject is in there, they 1263 00:55:12,960 --> 00:55:17,550 will impose certain kinds of stimuli on that subject 1264 00:55:17,550 --> 00:55:19,320 and see where the blood goes. 1265 00:55:19,320 --> 00:55:22,470 So, for example, they ask you to lie in the magnet, 1266 00:55:22,470 --> 00:55:24,510 and you're lying there, and they are 1267 00:55:24,510 --> 00:55:26,010 taking a movie of your brain. 1268 00:55:26,010 --> 00:55:27,840 And while you're lying there peacefully, 1269 00:55:27,840 --> 00:55:30,180 they'll take a needle, and they'll stab you with it. 
1270 00:55:30,180 --> 00:55:33,180 And you go, "Ow!" 1271 00:55:33,180 --> 00:55:37,590 And what they'll show is that blood flows to one very 1272 00:55:37,590 --> 00:55:40,400 specific region in the brain. 1273 00:55:40,400 --> 00:55:45,270 That region is the pain center of the brain, or at least 1274 00:55:45,270 --> 00:55:46,740 that's how they identify it. 1275 00:55:46,740 --> 00:55:48,977 So when you feel pain, and they'll 1276 00:55:48,977 --> 00:55:51,060 stab you with lots of different parts of the body, 1277 00:55:51,060 --> 00:55:53,850 and they'll show that it all leads to activation 1278 00:55:53,850 --> 00:55:55,950 of that little pain center. 1279 00:55:55,950 --> 00:55:58,920 So they'll say, fine, pain is identified 1280 00:55:58,920 --> 00:56:01,450 in that particular region. 1281 00:56:01,450 --> 00:56:04,320 Now, the third part of the brain, the last part 1282 00:56:04,320 --> 00:56:06,660 that MacLean identified, is actually 1283 00:56:06,660 --> 00:56:08,010 what we think of as the brain. 1284 00:56:08,010 --> 00:56:10,362 It's that gray matter, the curly coils. 1285 00:56:10,362 --> 00:56:11,820 It turns out that that's not solid, 1286 00:56:11,820 --> 00:56:17,770 but that's like a blanket that sits on top of the mid-brain. 1287 00:56:17,770 --> 00:56:19,980 And that part of the brain is actually 1288 00:56:19,980 --> 00:56:24,700 unique to homo sapiens and certain great apes. 1289 00:56:24,700 --> 00:56:26,947 So this is called the hominid brain, 1290 00:56:26,947 --> 00:56:28,530 and, from an evolutionary perspective, 1291 00:56:28,530 --> 00:56:30,790 it is the newest part of the brain, 1292 00:56:30,790 --> 00:56:33,630 so it's often called the neocortex, the newest 1293 00:56:33,630 --> 00:56:35,820 part of that brain. 1294 00:56:35,820 --> 00:56:39,060 Using MRI techniques, neuroscientists 1295 00:56:39,060 --> 00:56:42,210 have been able to deduce that this part of the brain 1296 00:56:42,210 --> 00:56:47,140 is responsible for language ability, mathematical ability, 1297 00:56:47,140 --> 00:56:48,870 and logical deliberation. 1298 00:56:48,870 --> 00:56:53,170 What we think of as the higher thought functions of humans. 1299 00:56:53,170 --> 00:56:56,902 It resides in that neocortex. 1300 00:56:56,902 --> 00:56:58,360 Now, why am I telling you all this? 1301 00:56:58,360 --> 00:57:00,850 Well, because it turns out that these three components 1302 00:57:00,850 --> 00:57:05,540 of the brain, they end up working in different ways. 1303 00:57:05,540 --> 00:57:07,450 In fact, they have different priorities. 1304 00:57:07,450 --> 00:57:08,420 What do I mean by that? 1305 00:57:08,420 --> 00:57:11,500 Well, let me give you an example. 1306 00:57:11,500 --> 00:57:14,130 Suppose, God forbid, you're in a car accident. 1307 00:57:14,130 --> 00:57:17,940 You're bleeding, and your body starts to shut down. 1308 00:57:17,940 --> 00:57:20,040 You go into shock. 1309 00:57:20,040 --> 00:57:25,300 And, one by one, your organs begin to fail. 1310 00:57:25,300 --> 00:57:28,360 Of the three components of the brain, 1311 00:57:28,360 --> 00:57:31,660 which do you think will shut down last? 1312 00:57:31,660 --> 00:57:34,660 Which of these three components will shut down last? 1313 00:57:34,660 --> 00:57:35,630 Yeah? 1314 00:57:35,630 --> 00:57:36,630 AUDIENCE: The reptilian? 1315 00:57:36,630 --> 00:57:37,080 ANDREW LO: That's right. 1316 00:57:37,080 --> 00:57:37,868 Why? 
1317 00:57:37,868 --> 00:57:41,059 AUDIENCE: It's responsible for your basic functions, so-- 1318 00:57:41,059 --> 00:57:41,850 ANDREW LO: Exactly. 1319 00:57:41,850 --> 00:57:44,680 Once your reptilian brain shuts down, you know, that's bye-bye. 1320 00:57:44,680 --> 00:57:45,500 That's the end. 1321 00:57:45,500 --> 00:57:46,350 Right? 1322 00:57:46,350 --> 00:57:50,160 So this explains why you can lose consciousness 1323 00:57:50,160 --> 00:57:51,430 but still breathe. 1324 00:57:51,430 --> 00:57:54,660 It's because consciousness is seated 1325 00:57:54,660 --> 00:57:58,470 in the hominid brain and certain parts of the mammalian brain. 1326 00:57:58,470 --> 00:57:59,970 But you don't need to be conscious 1327 00:57:59,970 --> 00:58:01,520 in order to be breathing. 1328 00:58:01,520 --> 00:58:03,550 Your reptilian brain controls that. 1329 00:58:03,550 --> 00:58:07,860 So from a priority perspective, the most important part 1330 00:58:07,860 --> 00:58:12,600 of your three components is the reptilian brain. 1331 00:58:12,600 --> 00:58:15,420 Now let me ask you the next question. 1332 00:58:15,420 --> 00:58:18,890 Before that, of the remaining two, 1333 00:58:18,890 --> 00:58:21,280 the hominid brain and the mammalian brain, which 1334 00:58:21,280 --> 00:58:22,520 do you think has priority? 1335 00:58:22,520 --> 00:58:27,397 Which shuts down later in the case of a car accident where 1336 00:58:27,397 --> 00:58:28,230 you're bleeding out? 1337 00:58:31,130 --> 00:58:31,752 Yeah? 1338 00:58:31,752 --> 00:58:33,080 AUDIENCE: The mammalian? 1339 00:58:33,080 --> 00:58:33,680 ANDREW LO: That's right. 1340 00:58:33,680 --> 00:58:34,179 Why? 1341 00:58:34,179 --> 00:58:36,770 AUDIENCE: Because that's where fight or flight is? 1342 00:58:36,770 --> 00:58:39,920 ANDREW LO: Fight or flight is in the mammalian brain, 1343 00:58:39,920 --> 00:58:43,626 but why would you think that that's higher priority? 1344 00:58:43,626 --> 00:58:45,562 AUDIENCE: 'Cause there's certain scenarios 1345 00:58:45,562 --> 00:58:48,002 and you want to react to those [INAUDIBLE] 1346 00:58:48,002 --> 00:58:48,710 ANDREW LO: Right. 1347 00:58:48,710 --> 00:58:50,930 From an evolutionary perspective, 1348 00:58:50,930 --> 00:58:52,610 it's probably more important for you 1349 00:58:52,610 --> 00:58:55,490 to be scared and run like hell when 1350 00:58:55,490 --> 00:58:57,950 you're being confronted by a saber tooth tiger 1351 00:58:57,950 --> 00:59:00,492 than for you to be able to solve differential equations. 1352 00:59:00,492 --> 00:59:01,730 Right? 1353 00:59:01,730 --> 00:59:04,550 Even if you're from MIT. 1354 00:59:04,550 --> 00:59:09,140 And so as a result, from an evolutionary standpoint 1355 00:59:09,140 --> 00:59:12,530 and, more importantly, from a physiological standpoint, 1356 00:59:12,530 --> 00:59:17,120 the mammalian brain has priority over the hominid brain. 1357 00:59:17,120 --> 00:59:19,640 Now, neuroscientists have documented 1358 00:59:19,640 --> 00:59:23,900 that this kind of hardwiring is even more significant 1359 00:59:23,900 --> 00:59:25,550 than we had thought. 1360 00:59:25,550 --> 00:59:26,960 Let me explain why. 1361 00:59:26,960 --> 00:59:29,070 They did the following experiment. 1362 00:59:29,070 --> 00:59:30,080 Take a subject. 1363 00:59:30,080 --> 00:59:31,520 Put him in a magnet. 1364 00:59:31,520 --> 00:59:35,894 And they asked the subject to solve arithmetic problems 1365 00:59:35,894 --> 00:59:37,060 while they're in the magnet.
1366 00:59:37,060 --> 00:59:39,185 Now, you can do that because, if you're lying down, 1367 00:59:39,185 --> 00:59:43,010 they have a mirror where they project the image of a computer 1368 00:59:43,010 --> 00:59:44,720 screen, and your hands are free even 1369 00:59:44,720 --> 00:59:46,550 though your head and your body are not. 1370 00:59:46,550 --> 00:59:48,425 And so, while you're lying down in the magnet 1371 00:59:48,425 --> 00:59:49,966 and you're looking up at this mirror, 1372 00:59:49,966 --> 00:59:51,070 you see a computer screen. 1373 00:59:51,070 --> 00:59:52,160 And with your hands-- 1374 00:59:52,160 --> 00:59:54,560 They give you a mouse, and you can move around 1375 00:59:54,560 --> 00:59:57,290 and solve arithmetic problems on the screen 1376 00:59:57,290 --> 00:59:59,060 while you're in the magnet. 1377 00:59:59,060 --> 01:00:02,780 And arithmetic is seated at this-- 1378 01:00:02,780 --> 01:00:03,860 the hominid brain level. 1379 01:00:03,860 --> 01:00:05,900 So they can see the blood flowing 1380 01:00:05,900 --> 01:00:09,020 into that part of the brain as you solve these arithmetic 1381 01:00:09,020 --> 01:00:10,810 problems. 1382 01:00:10,810 --> 01:00:15,370 While you're doing that, the experimenter will take a needle 1383 01:00:15,370 --> 01:00:18,280 and stab you with it. 1384 01:00:18,280 --> 01:00:19,690 And you go, "Ow!" 1385 01:00:19,690 --> 01:00:22,420 And then it takes you longer to solve those arithmetic 1386 01:00:22,420 --> 01:00:23,752 problems. 1387 01:00:23,752 --> 01:00:25,960 Now you're saying, this is what they give grants for? 1388 01:00:25,960 --> 01:00:29,229 I mean, anybody can tell you that when somebody's 1389 01:00:29,229 --> 01:00:31,270 trying to stab you, it's going to take you longer 1390 01:00:31,270 --> 01:00:32,985 to solve arithmetic problems. 1391 01:00:32,985 --> 01:00:34,300 OK. 1392 01:00:34,300 --> 01:00:36,340 But here's the insight. 1393 01:00:36,340 --> 01:00:38,830 The insight is that what they discovered 1394 01:00:38,830 --> 01:00:44,590 was that the flow of blood to your neocortex was constricted 1395 01:00:44,590 --> 01:00:48,420 after that pain was inflicted, 1396 01:00:48,420 --> 01:00:51,850 and it lasted for hours. 1397 01:00:51,850 --> 01:00:54,910 The restriction of blood flow to your neocortex 1398 01:00:54,910 --> 01:00:58,150 lasted for hours after that pinprick. 1399 01:00:58,150 --> 01:01:02,650 Not just seconds, not just minutes, but hours afterward, 1400 01:01:02,650 --> 01:01:06,420 it took you longer to solve arithmetic problems. 1401 01:01:06,420 --> 01:01:10,320 For all intents and purposes, that pain stimulus 1402 01:01:10,320 --> 01:01:13,320 made you stupider for a period of time. 1403 01:01:13,320 --> 01:01:17,610 You physiologically could not solve those problems 1404 01:01:17,610 --> 01:01:20,640 as quickly as before. 1405 01:01:20,640 --> 01:01:24,840 And this is a stunning result, because what it says 1406 01:01:24,840 --> 01:01:29,430 is that there are periods where, for purely physiological 1407 01:01:29,430 --> 01:01:33,480 reasons, we are going to be inhibited from using 1408 01:01:33,480 --> 01:01:35,110 that part of the brain. 1409 01:01:35,110 --> 01:01:36,930 I'm going to give you some examples of this 1410 01:01:36,930 --> 01:01:38,850 in your own behavior, just to prove 1411 01:01:38,850 --> 01:01:41,310 to you that this is actually going on right now 1412 01:01:41,310 --> 01:01:43,450 in your heads.
1413 01:01:43,450 --> 01:01:46,000 How many of you have had the following experience? 1414 01:01:46,000 --> 01:01:49,990 You're trying to meet a very attractive individual. 1415 01:01:49,990 --> 01:01:51,850 You want to go out on a date with them. 1416 01:01:51,850 --> 01:01:56,380 So you think of all sorts of really clever ways 1417 01:01:56,380 --> 01:01:59,050 of meeting them by accident. 1418 01:01:59,050 --> 01:02:00,807 And then, when you do that, you're 1419 01:02:00,807 --> 01:02:02,140 going to ask them out on a date. 1420 01:02:02,140 --> 01:02:03,730 And you've got your lines prepared. 1421 01:02:03,730 --> 01:02:04,600 You know exactly what you're going 1422 01:02:04,600 --> 01:02:06,460 to say to make them just completely 1423 01:02:06,460 --> 01:02:09,000 thrilled at going out with you. 1424 01:02:09,000 --> 01:02:10,740 And finally, it starts to unfold. 1425 01:02:10,740 --> 01:02:11,850 It happens. 1426 01:02:11,850 --> 01:02:15,510 You meet the person and, as soon as you open your mouth, 1427 01:02:15,510 --> 01:02:18,450 and you start talking, you sound like a total idiot. 1428 01:02:18,450 --> 01:02:21,210 You say all the wrong things, right? 1429 01:02:21,210 --> 01:02:23,640 You're tongue tied, you start stuttering. 1430 01:02:23,640 --> 01:02:25,739 And there's a reason for this. 1431 01:02:25,739 --> 01:02:27,030 There's a physiological reason. 1432 01:02:27,030 --> 01:02:28,480 You know what it is? 1433 01:02:28,480 --> 01:02:32,790 It's the fact that sexual attraction hyperstimulates 1434 01:02:32,790 --> 01:02:35,730 your mid-brain, here. 1435 01:02:35,730 --> 01:02:38,070 And when it does, it restricts your blood flow 1436 01:02:38,070 --> 01:02:39,330 to the neocortex. 1437 01:02:39,330 --> 01:02:43,410 Language ability is in the neocortex. 1438 01:02:43,410 --> 01:02:44,820 Love makes you stupid. 1439 01:02:48,510 --> 01:02:51,370 Let me give you another example. 1440 01:02:51,370 --> 01:02:54,330 This is an example that we're going to do together, OK? 1441 01:02:54,330 --> 01:02:56,830 The example is going to go like this. 1442 01:02:56,830 --> 01:03:00,120 I'm going to show you a bunch of words 1443 01:03:00,120 --> 01:03:02,230 that have different colors. 1444 01:03:02,230 --> 01:03:04,290 And I don't want you to read anything, 1445 01:03:04,290 --> 01:03:07,860 but I want you to say out loud with me what the colors are, 1446 01:03:07,860 --> 01:03:09,240 from left to right. 1447 01:03:09,240 --> 01:03:10,050 OK? 1448 01:03:10,050 --> 01:03:11,890 So these are the words. 1449 01:03:11,890 --> 01:03:14,700 But we're going to say out loud together, as quickly 1450 01:03:14,700 --> 01:03:17,340 as possible, the colors. 1451 01:03:17,340 --> 01:03:19,346 You guys ready? 1452 01:03:19,346 --> 01:03:19,970 OK, here we go. 1453 01:03:19,970 --> 01:03:23,570 Red, green, blue, yellow, orange, blue, brown, red, 1454 01:03:23,570 --> 01:03:26,630 green, purple, pink, black, blue, yellow, green. 1455 01:03:26,630 --> 01:03:27,500 No problem, right? 1456 01:03:27,500 --> 01:03:29,490 Very easy. 1457 01:03:29,490 --> 01:03:32,680 We do the exact same thing now with this. 1458 01:03:32,680 --> 01:03:36,230 Say the colors as quickly as possible. 1459 01:03:36,230 --> 01:03:37,140 You ready? 1460 01:03:37,140 --> 01:03:38,030 OK, here we go. 1461 01:03:38,030 --> 01:03:40,740 Blue, red, blue, black, orange-- 1462 01:03:40,740 --> 01:03:44,540 [LAUGHTER] 1463 01:03:44,540 --> 01:03:46,440 What happened? 1464 01:03:46,440 --> 01:03:48,340 What's going on? 
1465 01:03:48,340 --> 01:03:51,950 Couldn't do it as quickly, could you? 1466 01:03:51,950 --> 01:03:54,980 Well, let me tell you what's going on. 1467 01:03:54,980 --> 01:04:00,140 It turns out that language ability and color recognition 1468 01:04:00,140 --> 01:04:02,600 are actually handled by two different parts of the brain. 1469 01:04:02,600 --> 01:04:05,060 Language ability resides strictly 1470 01:04:05,060 --> 01:04:08,600 in the neocortex, the hominid brain. 1471 01:04:08,600 --> 01:04:11,150 But color recognition is actually 1472 01:04:11,150 --> 01:04:13,010 part of the mammalian brain. 1473 01:04:13,010 --> 01:04:15,999 Mammals can recognize colors very well. 1474 01:04:15,999 --> 01:04:17,540 In fact, I don't know how many of you 1475 01:04:17,540 --> 01:04:21,710 have redecorated your homes, or your kitchens, or dining rooms, 1476 01:04:21,710 --> 01:04:22,550 recently. 1477 01:04:22,550 --> 01:04:24,390 I did this a couple of years ago. 1478 01:04:24,390 --> 01:04:26,270 And my wife talked to one of her friends 1479 01:04:26,270 --> 01:04:28,430 who is a decorator who told her, you've 1480 01:04:28,430 --> 01:04:31,550 got to paint your dining room in red. 1481 01:04:31,550 --> 01:04:32,660 Because-- Why? 1482 01:04:32,660 --> 01:04:34,100 Anybody know? 1483 01:04:34,100 --> 01:04:37,230 Why is red the preferred color of restaurants, dining rooms? 1484 01:04:37,230 --> 01:04:37,730 Yeah? 1485 01:04:37,730 --> 01:04:38,720 AUDIENCE: Makes you eat more. 1486 01:04:38,720 --> 01:04:39,230 ANDREW LO: Exactly. 1487 01:04:39,230 --> 01:04:40,021 Makes you eat more. 1488 01:04:40,021 --> 01:04:43,440 It's a more appetizing color. 1489 01:04:43,440 --> 01:04:46,012 Anybody ever wonder why that is? 1490 01:04:46,012 --> 01:04:47,970 Do you know why red is a more appetizing color? 1491 01:04:47,970 --> 01:04:50,370 I hate to tell you because you may not be appetized anymore. 1492 01:04:50,370 --> 01:04:51,360 AUDIENCE: Because of the color? 1493 01:04:51,360 --> 01:04:52,818 ANDREW LO: It's the color of blood. 1494 01:04:52,818 --> 01:04:54,450 Because it's hardwired into us. 1495 01:04:54,450 --> 01:04:57,074 When we see blood, it's dinnertime. 1496 01:04:57,074 --> 01:04:59,722 [LAUGHTER] 1497 01:04:59,722 --> 01:05:03,240 And that is hardwired. 1498 01:05:03,240 --> 01:05:04,294 It's true. 1499 01:05:04,294 --> 01:05:06,210 I mean, I don't know how many of you have ever 1500 01:05:06,210 --> 01:05:07,500 gone to The Blue Room. 1501 01:05:07,500 --> 01:05:09,420 That's a restaurant down the street. 1502 01:05:09,420 --> 01:05:11,010 I've actually never been there, and I 1503 01:05:11,010 --> 01:05:13,470 have to confess the reason is that the idea 1504 01:05:13,470 --> 01:05:16,950 of a blue room for dinner is not very appetizing to me. 1505 01:05:16,950 --> 01:05:20,520 Blue is not a color you want to see in your food. 1506 01:05:20,520 --> 01:05:23,190 So color recognition is a deep-seated part 1507 01:05:23,190 --> 01:05:24,630 of our mammalian brain. 1508 01:05:24,630 --> 01:05:26,640 But language ability is not. 1509 01:05:26,640 --> 01:05:29,550 Language ability is a different part of the brain. 1510 01:05:29,550 --> 01:05:34,290 In this case, where the colors and the words don't match, 1511 01:05:34,290 --> 01:05:36,990 it takes your brain a split second longer 1512 01:05:36,990 --> 01:05:39,510 to adjudicate that conflict. 1513 01:05:39,510 --> 01:05:41,790 But in this case, when they're the same, 1514 01:05:41,790 --> 01:05:43,590 you can rattle it off like this. 
1515 01:05:43,590 --> 01:05:44,720 [SNAPS FINGERS] 1516 01:05:44,720 --> 01:05:47,470 So let me bring this now all back to finance, 1517 01:05:47,470 --> 01:05:49,502 as I told you I would. 1518 01:05:49,502 --> 01:05:50,710 What's the point of all this? 1519 01:05:50,710 --> 01:05:54,010 The point is that what we take to be 1520 01:05:54,010 --> 01:05:57,400 human preferences for decision-making 1521 01:05:57,400 --> 01:06:00,580 is not a mathematical, immutable object. 1522 01:06:00,580 --> 01:06:03,370 But, in fact, preferences are the outcome 1523 01:06:03,370 --> 01:06:06,460 of interactions between the mammalian brain 1524 01:06:06,460 --> 01:06:08,260 and the hominid brain. 1525 01:06:08,260 --> 01:06:14,250 And what we think of as basic decision-making 1526 01:06:14,250 --> 01:06:18,310 requires the proper balance of the two. 1527 01:06:18,310 --> 01:06:24,150 So, in other words, rationality is not logical deliberation. 1528 01:06:24,150 --> 01:06:26,760 The patient, Elliot, was a clear counterexample. 1529 01:06:26,760 --> 01:06:31,055 Elliot was very logical, but he was not rational. 1530 01:06:31,055 --> 01:06:35,730 Emotion is a critical component of rationality. 1531 01:06:35,730 --> 01:06:39,290 But too much emotion, and you get stupid, 1532 01:06:39,290 --> 01:06:43,160 as is the case with falling in love or in the case 1533 01:06:43,160 --> 01:06:45,920 of overwhelming emotional stimulus 1534 01:06:45,920 --> 01:06:50,060 cutting off blood flow to your neocortex. 1535 01:06:50,060 --> 01:06:54,331 This, by the way, will lead to some very useful self-help kind 1536 01:06:54,331 --> 01:06:54,830 of advice. 1537 01:06:54,830 --> 01:06:56,630 Let me give you an example. 1538 01:06:56,630 --> 01:06:59,480 Those of you who've ever been involved in corporate politics, 1539 01:06:59,480 --> 01:07:01,610 back office politics-- 1540 01:07:01,610 --> 01:07:04,820 one of the things that you should keep in mind 1541 01:07:04,820 --> 01:07:09,170 is that, when you are emotionally hyper-stimulated, 1542 01:07:09,170 --> 01:07:11,476 you will not be able to think straight. 1543 01:07:11,476 --> 01:07:13,850 How many of you've ever heard the phrase, I was so angry, 1544 01:07:13,850 --> 01:07:15,509 I could barely speak? 1545 01:07:15,509 --> 01:07:16,550 You've heard that, right? 1546 01:07:16,550 --> 01:07:17,930 You've probably felt that. 1547 01:07:17,930 --> 01:07:21,050 Well, that's not just a metaphor. 1548 01:07:21,050 --> 01:07:22,820 That's actually physiologically true. 1549 01:07:22,820 --> 01:07:26,120 When you are so angry, it will cut off the flow of blood 1550 01:07:26,120 --> 01:07:28,610 to your neocortex, and you will not be able to speak. 1551 01:07:28,610 --> 01:07:31,850 You will not be able to express yourself as articulately. 1552 01:07:31,850 --> 01:07:34,100 So the next time somebody goes into your office, 1553 01:07:34,100 --> 01:07:36,770 or your cubicle, or your company, 1554 01:07:36,770 --> 01:07:40,040 and says something to you that makes you so mad 1555 01:07:40,040 --> 01:07:45,000 that you want to strangle them, and you can't speak, 1556 01:07:45,000 --> 01:07:48,210 the best thing to do is not to speak. 1557 01:07:48,210 --> 01:07:50,060 It's to say, "Thank you very much. 1558 01:07:50,060 --> 01:07:51,340 That was very insightful. 1559 01:07:51,340 --> 01:07:53,680 Let me get back to you in a week or so."
1560 01:07:53,680 --> 01:07:56,440 Because if you try to talk when you are emotionally 1561 01:07:56,440 --> 01:07:58,510 hyper-stimulated, you will undoubtedly 1562 01:07:58,510 --> 01:08:00,610 say stupid things, things you will regret, 1563 01:08:00,610 --> 01:08:03,220 possibly for years afterward. 1564 01:08:03,220 --> 01:08:07,120 And this is a direct outcome of this neurophysiological 1565 01:08:07,120 --> 01:08:08,560 research. 1566 01:08:08,560 --> 01:08:12,070 Our preferences are the sum total of these interactions. 1567 01:08:12,070 --> 01:08:14,740 Therefore, they are not stable over time, 1568 01:08:14,740 --> 01:08:17,080 nor are they stable over circumstances. 1569 01:08:17,080 --> 01:08:17,649 All right? 1570 01:08:17,649 --> 01:08:21,100 You are not the same person that you were 10 years ago. 1571 01:08:21,100 --> 01:08:24,880 And many of you who are going to be facing losses of one kind 1572 01:08:24,880 --> 01:08:28,149 or another-- you lose a parent, you lose a spouse or child-- 1573 01:08:28,149 --> 01:08:30,970 you will not be the same person after that event as before. 1574 01:08:30,970 --> 01:08:33,250 It will change you because it will 1575 01:08:33,250 --> 01:08:36,729 change the balance between your hominid brain 1576 01:08:36,729 --> 01:08:38,290 and your mammalian brain. 1577 01:08:38,290 --> 01:08:40,750 And unless you bring those back into balance, 1578 01:08:40,750 --> 01:08:43,250 you're going to be making decisions differently. 1579 01:08:43,250 --> 01:08:45,100 So if there's any advice that I have 1580 01:08:45,100 --> 01:08:47,640 for you about how to take this information and use it, 1581 01:08:47,640 --> 01:08:50,710 it's to keep your eyes on the objective. 1582 01:08:50,710 --> 01:08:56,430 In other words, decide what you want to achieve and then ask 1583 01:08:56,430 --> 01:08:59,340 yourself at every step of the way, 1584 01:08:59,340 --> 01:09:04,290 "Are my current actions about to help or hinder 1585 01:09:04,290 --> 01:09:05,729 those objectives?" 1586 01:09:05,729 --> 01:09:08,609 You know, if you're driving home from work one day, 1587 01:09:08,609 --> 01:09:10,770 and somebody cuts you off, and you 1588 01:09:10,770 --> 01:09:13,319 want to give them the finger and ram your car into them, 1589 01:09:13,319 --> 01:09:15,720 ask yourself, will that help you achieve 1590 01:09:15,720 --> 01:09:20,250 your objective of getting home without an incident or not? 1591 01:09:20,250 --> 01:09:22,439 If it doesn't, you probably shouldn't do it, 1592 01:09:22,439 --> 01:09:25,080 even though it'd feel really good to do that. 1593 01:09:25,080 --> 01:09:26,250 OK? 1594 01:09:26,250 --> 01:09:30,029 Now, let me tell you about the adaptive markets hypothesis 1595 01:09:30,029 --> 01:09:33,040 and how I'm going to put it all together for you. 1596 01:09:33,040 --> 01:09:36,189 Behavioral finance and rational finance. 1597 01:09:36,189 --> 01:09:39,640 They are both correct, and they're both incorrect. 1598 01:09:39,640 --> 01:09:41,529 The reason that they're both correct 1599 01:09:41,529 --> 01:09:46,720 is that they both apply to certain circumstances. 1600 01:09:46,720 --> 01:09:48,760 What I taught you in this course up until now, 1601 01:09:48,760 --> 01:09:52,029 what we've learned together, is the rational finance 1602 01:09:52,029 --> 01:09:54,760 that applies most of the time when 1603 01:09:54,760 --> 01:09:56,980 your mid-brain and your neocortex 1604 01:09:56,980 --> 01:09:58,750 are properly balanced.
1605 01:09:58,750 --> 01:10:01,570 But during extreme periods of distress, 1606 01:10:01,570 --> 01:10:04,960 when you are emotionally overwhelmed, either positively 1607 01:10:04,960 --> 01:10:08,950 or negatively, in either of those circumstances, 1608 01:10:08,950 --> 01:10:13,660 you will not behave in ways that we would consider rational. 1609 01:10:13,660 --> 01:10:17,080 Those are the moments when behavioral finance applies 1610 01:10:17,080 --> 01:10:19,300 or, at least, when the anomalies arise. 1611 01:10:19,300 --> 01:10:21,220 Behavioral finance is not a theory; 1612 01:10:21,220 --> 01:10:24,040 it's simply a collection of anomalies. 1613 01:10:24,040 --> 01:10:26,830 The adaptive markets hypothesis is the theory 1614 01:10:26,830 --> 01:10:29,230 that I've been developing for the last several years that 1615 01:10:29,230 --> 01:10:31,030 tries to put these two together. 1616 01:10:31,030 --> 01:10:33,070 And the way to do that is by recognizing 1617 01:10:33,070 --> 01:10:36,370 that we are both creatures of our mammalian brain 1618 01:10:36,370 --> 01:10:38,680 and creatures of our neocortex. 1619 01:10:38,680 --> 01:10:42,160 Depending on market conditions, environmental conditions, 1620 01:10:42,160 --> 01:10:44,810 and our own decision-making process, 1621 01:10:44,810 --> 01:10:48,730 we may or may not be in the rational camp 1622 01:10:48,730 --> 01:10:52,000 or the behavioral camp at any point in time. 1623 01:10:52,000 --> 01:10:54,490 And when you look at the market as a whole, an aggregate 1624 01:10:54,490 --> 01:10:56,380 over all of these individuals, you 1625 01:10:56,380 --> 01:10:58,540 will be able to get a sense of whether or not 1626 01:10:58,540 --> 01:11:00,670 you can trust market prices. 1627 01:11:00,670 --> 01:11:03,689 At the beginning of the course, as this crisis was unfolding, 1628 01:11:03,689 --> 01:11:05,480 I told you very clearly that finance theory 1629 01:11:05,480 --> 01:11:08,050 was going to be taking a several-week vacation. 1630 01:11:08,050 --> 01:11:11,200 What I was getting at is that, right now, all of us 1631 01:11:11,200 --> 01:11:14,350 are being hyper-stimulated in our mid-brain. 1632 01:11:14,350 --> 01:11:17,470 We are all scared to death as to what we're going to read about. 1633 01:11:17,470 --> 01:11:19,930 There was a time during this course when 1634 01:11:19,930 --> 01:11:22,106 I was really not looking forward to the weekends 1635 01:11:22,106 --> 01:11:24,730 because something always seemed to be going on over the weekend. 1636 01:11:24,730 --> 01:11:27,313 And Monday I'd have to tell you, well, gee, this bank is gone, 1637 01:11:27,313 --> 01:11:29,140 and that investment bank is no more. 1638 01:11:29,140 --> 01:11:31,720 And it was a very stressful period. 1639 01:11:31,720 --> 01:11:33,450 It still is. 1640 01:11:33,450 --> 01:11:36,130 Decision-making is going to be affected by that. 1641 01:11:36,130 --> 01:11:39,300 So we develop heuristics, rules of thumb, 1642 01:11:39,300 --> 01:11:45,090 to balance out our neocortex and our mammalian brain. 1643 01:11:45,090 --> 01:11:49,320 We come up not with theoretically optimal optimization 1644 01:11:49,320 --> 01:11:52,030 algorithms, but with rules of thumb. 1645 01:11:52,030 --> 01:11:55,560 And the idea behind the adaptive markets hypothesis 1646 01:11:55,560 --> 01:11:59,460 is that these rules of thumb evolve over time as they either 1647 01:11:59,460 --> 01:12:01,770 succeed or fail.
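To make the idea of feedback-driven rules of thumb concrete, here is a minimal, hypothetical sketch in Python; the function name, threshold, and numbers are illustrative assumptions, not anything from the lecture. The rule of thumb persists as long as outcomes are tolerable and only changes after a strongly negative outcome, which is the same pattern the getting-dressed example below illustrates.

    # Hypothetical sketch: a rule of thumb that persists until strong
    # negative feedback forces it to adapt (names and numbers are assumptions).
    def update_rule_of_thumb(time_budget_minutes, feedback, threshold=-1.0, cut_factor=0.1):
        """Keep the current heuristic unless feedback is strongly negative."""
        if feedback <= threshold:
            # Strong negative emotion: cut the time budget sharply.
            return max(1, int(time_budget_minutes * cut_factor))
        return time_budget_minutes

    budget = 60  # starting rule: spend up to an hour on the decision
    for day, feedback in enumerate([0.0, 0.2, -2.0, 0.1], start=1):
        budget = update_rule_of_thumb(budget, feedback)
        print(f"day {day}: time budget = {budget} minutes")
    # The budget stays at 60 until the -2.0 shock on day 3, then drops to 6
    # and stays there: the heuristic has adapted.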
1648 01:12:01,770 --> 01:12:05,430 So the economics that we learn and teach you in class 1649 01:12:05,430 --> 01:12:09,390 is meant to be an approximation to a much more complex reality. 1650 01:12:09,390 --> 01:12:11,820 Sometimes that approximation works, 1651 01:12:11,820 --> 01:12:14,010 sometimes it doesn't, and the only way 1652 01:12:14,010 --> 01:12:16,890 to really understand which it is, 1653 01:12:16,890 --> 01:12:19,890 is to try to get a sense of how this evolutionary process 1654 01:12:19,890 --> 01:12:20,760 works. 1655 01:12:20,760 --> 01:12:23,640 In particular, where do these various different kinds 1656 01:12:23,640 --> 01:12:26,530 of rules of thumb come from? 1657 01:12:26,530 --> 01:12:30,550 And to do that, I want to tell you about one last thing, 1658 01:12:30,550 --> 01:12:32,020 and then we're going to conclude, 1659 01:12:32,020 --> 01:12:35,500 which is that evolutionary forces are really 1660 01:12:35,500 --> 01:12:39,040 what drives the success or failure 1661 01:12:39,040 --> 01:12:41,260 of these rules of thumb. 1662 01:12:41,260 --> 01:12:44,080 And I want to give you an example of these kinds of rules 1663 01:12:44,080 --> 01:12:45,910 of thumb and how adaptation works. 1664 01:12:45,910 --> 01:12:47,860 And to do that, I'm going to tell you 1665 01:12:47,860 --> 01:12:52,510 about a personal example that has to do with a problem 1666 01:12:52,510 --> 01:12:54,370 that I solve every day. 1667 01:12:54,370 --> 01:12:57,740 And it's a problem that I think you have as well. 1668 01:12:57,740 --> 01:13:00,490 I'm going to tell you about how my heuristic developed 1669 01:13:00,490 --> 01:13:02,050 over the years. 1670 01:13:02,050 --> 01:13:03,850 The problem that I'm going to talk about 1671 01:13:03,850 --> 01:13:05,990 is the problem of getting dressed in the morning. 1672 01:13:05,990 --> 01:13:06,490 OK? 1673 01:13:06,490 --> 01:13:08,860 Every morning, all of us have to get dressed. 1674 01:13:08,860 --> 01:13:11,440 And to give you a sense of the challenges 1675 01:13:11,440 --> 01:13:14,600 that I face in my process of getting dressed, 1676 01:13:14,600 --> 01:13:16,360 I have to tell you about my wardrobe. 1677 01:13:16,360 --> 01:13:18,340 So here's my wardrobe. 1678 01:13:18,340 --> 01:13:20,500 I've got five jackets, 10 pairs of pants, 1679 01:13:20,500 --> 01:13:23,290 20 ties, 10 shirts, 10 pairs of socks, four pairs of shoes, 1680 01:13:23,290 --> 01:13:25,730 and five belts. 1681 01:13:25,730 --> 01:13:28,520 Now, you might argue that's a rather limited wardrobe. 1682 01:13:28,520 --> 01:13:33,020 However, if you took the time to compute the combinatorics, 1683 01:13:33,020 --> 01:13:36,800 you would see that I have two million unique outfits 1684 01:13:36,800 --> 01:13:38,420 in my wardrobe. 1685 01:13:38,420 --> 01:13:42,530 Now, granted, not all of them are of the same fashion 1686 01:13:42,530 --> 01:13:43,610 content. 1687 01:13:43,610 --> 01:13:45,560 They're not all equally compelling 1688 01:13:45,560 --> 01:13:47,660 from the fashion perspective, so I've 1689 01:13:47,660 --> 01:13:48,890 got an optimization problem. 1690 01:13:48,890 --> 01:13:51,290 I've got to figure out which is the right outfit 1691 01:13:51,290 --> 01:13:53,040 for the right occasion. 1692 01:13:53,040 --> 01:13:56,390 Suppose it takes me one second to evaluate the fashion 1693 01:13:56,390 --> 01:13:59,250 content of an outfit. 1694 01:13:59,250 --> 01:14:02,130 How long would it take me to get dressed every day? 
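As a quick check of the arithmetic, here is a short Python sketch; the item counts are the ones just listed, and the one-second-per-outfit evaluation time is the assumption stated above.

    # Combinatorics of the wardrobe described in the lecture.
    jackets, pants, ties, shirts, socks, shoes, belts = 5, 10, 20, 10, 10, 4, 5
    outfits = jackets * pants * ties * shirts * socks * shoes * belts
    print(outfits)                                  # 2,000,000 unique outfits

    seconds_per_outfit = 1                          # stated assumption: one second each
    days = outfits * seconds_per_outfit / 86_400    # 86,400 seconds in a day
    print(round(days, 1))                           # about 23.1 days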
1695 01:14:02,130 --> 01:14:04,350 Well, you do the math. 1696 01:14:04,350 --> 01:14:07,674 Turns out that it would take me 23.1 days to get dressed. 1697 01:14:07,674 --> 01:14:09,840 Now, clearly, I don't take that long to get dressed, 1698 01:14:09,840 --> 01:14:11,250 so how do I do it? 1699 01:14:11,250 --> 01:14:14,370 Do I have some superhuman optimization algorithm up 1700 01:14:14,370 --> 01:14:16,800 my sleeve? 1701 01:14:16,800 --> 01:14:17,820 No. 1702 01:14:17,820 --> 01:14:20,427 It turns out I use a heuristic, a rule of thumb. 1703 01:14:20,427 --> 01:14:23,010 And you might say, well, where did you get that rule of thumb? 1704 01:14:23,010 --> 01:14:24,930 I'll tell you. 1705 01:14:24,930 --> 01:14:27,600 When I was six years old, growing up in Queens, 1706 01:14:27,600 --> 01:14:31,050 New York, the superhero of the day 1707 01:14:31,050 --> 01:14:32,820 was Superman. 1708 01:14:32,820 --> 01:14:36,747 Everybody watched Superman on TV, read the comic books, and so on. 1709 01:14:36,747 --> 01:14:38,580 And some clever marketing genius figured out 1710 01:14:38,580 --> 01:14:41,370 that if you put a Superman emblem on a jacket, 1711 01:14:41,370 --> 01:14:44,520 you could sell a lot of them to six-year-old kids. 1712 01:14:44,520 --> 01:14:48,070 And growing up in a single-parent household, 1713 01:14:48,070 --> 01:14:50,190 we didn't have a lot of extra cash, 1714 01:14:50,190 --> 01:14:54,570 so it took me weeks of nagging my mother to get me this jacket. 1715 01:14:54,570 --> 01:14:56,850 Weeks and weeks of daily nagging. 1716 01:14:56,850 --> 01:14:58,840 And finally, after weeks of nagging, 1717 01:14:58,840 --> 01:15:02,560 she relented, and she agreed to buy me this jacket. 1718 01:15:02,560 --> 01:15:03,960 And I remember vividly, you know, 1719 01:15:03,960 --> 01:15:06,710 Friday afternoon after school, she got home from work. 1720 01:15:06,710 --> 01:15:09,239 I waited for her, we went to Alexander's, bought the jacket. 1721 01:15:09,239 --> 01:15:11,280 I spent the whole weekend wearing this thing, day 1722 01:15:11,280 --> 01:15:11,780 and night. 1723 01:15:11,780 --> 01:15:13,500 It was wonderful. 1724 01:15:13,500 --> 01:15:17,556 And Monday morning, I got up extra early, really excited 1725 01:15:17,556 --> 01:15:18,930 to go to school with this jacket. 1726 01:15:18,930 --> 01:15:20,888 I looked in the mirror, did all my action poses, 1727 01:15:20,888 --> 01:15:23,220 and looked at myself. 1728 01:15:23,220 --> 01:15:26,130 And by the time I was done with that, 1729 01:15:26,130 --> 01:15:28,710 I was 15 minutes late for school, 1730 01:15:28,710 --> 01:15:32,650 and I needed a note from my mother to get into class. 1731 01:15:32,650 --> 01:15:35,490 And I remember vividly walking into the classroom, 1732 01:15:35,490 --> 01:15:37,620 going up to the front of the room, 1733 01:15:37,620 --> 01:15:40,110 giving the teacher the note, walking back 1734 01:15:40,110 --> 01:15:42,450 to my seat, all the other kids already sitting down, 1735 01:15:42,450 --> 01:15:44,520 reading, snickering at me. 1736 01:15:44,520 --> 01:15:47,010 And I was completely mortified. 1737 01:15:47,010 --> 01:15:49,650 And you know I must have been mortified because, 42 years later, I 1738 01:15:49,650 --> 01:15:52,710 still remember this exactly. 1739 01:15:52,710 --> 01:15:53,700 And you know what? 1740 01:15:53,700 --> 01:15:57,900 From that day on, it never took me more than five minutes 1741 01:15:57,900 --> 01:15:59,130 to get dressed. 1742 01:15:59,130 --> 01:15:59,970 Ever.
1743 01:15:59,970 --> 01:16:02,250 Never. 1744 01:16:02,250 --> 01:16:07,650 So I solved my problem based upon very strong 1745 01:16:07,650 --> 01:16:10,020 negative feedback. 1746 01:16:10,020 --> 01:16:11,274 You see how it works? 1747 01:16:11,274 --> 01:16:13,690 I have a new heuristic for getting dressed in the morning, 1748 01:16:13,690 --> 01:16:17,100 but that new heuristic only came about after some very 1749 01:16:17,100 --> 01:16:20,270 strong negative emotion. 1750 01:16:20,270 --> 01:16:22,490 If I were like Elliot, and I never 1751 01:16:22,490 --> 01:16:25,800 had any negative feedback, I'd go 1752 01:16:25,800 --> 01:16:29,070 on taking hours to get dressed every morning 1753 01:16:29,070 --> 01:16:32,190 instead of getting dressed in five minutes, like I do now. 1754 01:16:32,190 --> 01:16:36,429 Now, take somebody else, let's say a movie star like Tom Cruise. 1755 01:16:36,429 --> 01:16:38,970 I suspect that Tom Cruise spends a lot more than five minutes 1756 01:16:38,970 --> 01:16:40,620 getting dressed in the morning. 1757 01:16:40,620 --> 01:16:43,110 In fact, he probably spends more time coiffing his hair 1758 01:16:43,110 --> 01:16:46,310 than I spend coiffing mine. 1759 01:16:46,310 --> 01:16:48,360 And there's a reason for that. 1760 01:16:48,360 --> 01:16:50,790 For him, it makes sense. 1761 01:16:50,790 --> 01:16:54,510 That heuristic makes sense in his context, 1762 01:16:54,510 --> 01:16:56,760 not in my context. 1763 01:16:56,760 --> 01:17:00,620 All of us engage in heuristics 1764 01:17:00,620 --> 01:17:04,890 that are adaptive to the particular situation at hand. 1765 01:17:04,890 --> 01:17:06,570 So the adaptive markets hypothesis 1766 01:17:06,570 --> 01:17:08,610 acknowledges that people make mistakes, 1767 01:17:08,610 --> 01:17:11,610 but they learn, and they adapt, assuming 1768 01:17:11,610 --> 01:17:16,410 there's balance between emotion and logical deliberation. 1769 01:17:16,410 --> 01:17:20,400 So there are a number of very practical implications 1770 01:17:20,400 --> 01:17:25,460 of this, some of which suggest that the CAPM does not 1771 01:17:25,460 --> 01:17:29,540 work and should not work, except during certain periods 1772 01:17:29,540 --> 01:17:30,870 of the market. 1773 01:17:30,870 --> 01:17:34,940 So what we've learned together in this course 1774 01:17:34,940 --> 01:17:40,040 describes 80% to 90% of what happens in markets. 1775 01:17:40,040 --> 01:17:44,450 But there's another 10% or 20% of extreme and unusual market 1776 01:17:44,450 --> 01:17:48,260 conditions that make this field a lot more challenging than you 1777 01:17:48,260 --> 01:17:50,060 might otherwise think. 1778 01:17:50,060 --> 01:17:52,490 And that's the part that I want you to keep in mind. 1779 01:17:52,490 --> 01:17:55,520 We're not going to talk about that anymore in this course, 1780 01:17:55,520 --> 01:17:59,480 or in 402, or in 433, or in 434, or 437. 1781 01:17:59,480 --> 01:18:02,210 We don't talk about the messy part of finance 1782 01:18:02,210 --> 01:18:04,230 because it's still being developed. 1783 01:18:04,230 --> 01:18:05,810 In fact, as I told you at the very beginning, 1784 01:18:05,810 --> 01:18:07,351 you're not going to find any textbook 1785 01:18:07,351 --> 01:18:09,680 versions of the adaptive markets hypothesis. 1786 01:18:09,680 --> 01:18:11,721 The only person you'll ever find talking about it 1787 01:18:11,721 --> 01:18:14,330 right now is me, unfortunately.
1788 01:18:14,330 --> 01:18:15,990 Hopefully, that will change over time, 1789 01:18:15,990 --> 01:18:17,630 but this is still under development. 1790 01:18:17,630 --> 01:18:21,410 But just keep in mind that the theories that you've learned 1791 01:18:21,410 --> 01:18:24,620 do not apply all of the time. 1792 01:18:24,620 --> 01:18:27,590 Most of the time, they apply and they're useful, 1793 01:18:27,590 --> 01:18:31,040 but they are an approximation to a much more complex reality. 1794 01:18:31,040 --> 01:18:33,590 And that reality will incorporate 1795 01:18:33,590 --> 01:18:37,220 lots of other phenomena, which we described here 1796 01:18:37,220 --> 01:18:41,020 and which I'll talk a bit more about on Monday. 1797 01:18:41,020 --> 01:18:41,970 OK. 1798 01:18:41,970 --> 01:18:44,620 So let me stop here since we're just about out of time. 1799 01:18:44,620 --> 01:18:47,151 Any questions? 1800 01:18:47,151 --> 01:18:47,650 No? 1801 01:18:47,650 --> 01:18:48,380 Yes? 1802 01:18:48,380 --> 01:18:52,980 AUDIENCE: So I was wondering what you think about this, for example. 1803 01:18:52,980 --> 01:18:56,500 If I have an exam, I tend to notice 1804 01:18:56,500 --> 01:19:00,560 that I will be way more productive at learning 1805 01:19:00,560 --> 01:19:03,508 in the two days right before it. 1806 01:19:03,508 --> 01:19:07,950 And I can't quite see how that fits, because I was feeling 1807 01:19:07,950 --> 01:19:10,100 the pressure and handling my emotions. 1808 01:19:10,100 --> 01:19:12,860 But how is it possible that I feel the emotion but, 1809 01:19:12,860 --> 01:19:14,644 at the same time, I-- 1810 01:19:14,644 --> 01:19:16,310 ANDREW LO: That is a fantastic question. 1811 01:19:16,310 --> 01:19:17,240 Let me repeat the question. 1812 01:19:17,240 --> 01:19:18,823 I'm not going to answer it today, but I'm 1813 01:19:18,823 --> 01:19:20,780 going to talk about it on Monday, because it's 1814 01:19:20,780 --> 01:19:24,710 both a great question in the context of this theory 1815 01:19:24,710 --> 01:19:26,370 and also very apropos, since I'm 1816 01:19:26,370 --> 01:19:28,760 going to be talking about the final exam next Monday. 1817 01:19:28,760 --> 01:19:29,510 OK? 1818 01:19:29,510 --> 01:19:32,134 The question is, when you're studying for an exam, 1819 01:19:32,134 --> 01:19:33,050 you're under pressure. 1820 01:19:33,050 --> 01:19:35,150 You're feeling a lot of emotional stress, 1821 01:19:35,150 --> 01:19:39,050 and yet you seem to learn better, faster, 1822 01:19:39,050 --> 01:19:41,580 during those periods of time. 1823 01:19:41,580 --> 01:19:45,410 It turns out that there is a physiological and psychological 1824 01:19:45,410 --> 01:19:48,530 basis for why a certain amount of pressure 1825 01:19:48,530 --> 01:19:50,149 is actually a good thing. 1826 01:19:50,149 --> 01:19:52,190 But the question is, what is that certain amount? 1827 01:19:52,190 --> 01:19:55,367 Any more than that, and you'd be a basket case. 1828 01:19:55,367 --> 01:19:57,200 So we're going to talk about that on Monday. 1829 01:19:57,200 --> 01:19:57,700 All right? 1830 01:19:57,700 --> 01:19:59,110 Thanks.