PATRICK WINSTON: Well, don't stop. Shoot. I guess we've got to stop. I will soon go into withdrawal symptoms that will last about six weeks. But, on the other hand, we all are beginning to develop a sort of tired and desperate look. And perhaps it's a good thing to get the semester behind us and go into solstice hibernation.

Anyway, there's a lot to do today. I want to wrap up a couple of things, talk about what's next, and maybe get into some big issues, perhaps a little Genesis demo, that sort of thing.

So here's what we're going to do first. Last time, I talked about this whole idea of structure discovery. And really, the whole reason I cracked and started talking about basic methods is because of the potential utility of taking that idea one step further and finding structure in situations where you might not otherwise find it. It's still an open question whether that's the best way to think about it. But here it goes.

Imagine you've got a couple of stories. And these circles represent the events in the story. And now what you'd like to get out of these stories is some kind of finite state graph that describes the collection of stories. So you might discover, for example, that these two events are quite similar. And these two events are quite similar. So you might use that as a basis for speculating that maybe a more compact way of representing the stuff in the story would look like this. Where this one-- let's see. This one goes with this one, this one goes with this one, and there's a possibility of another state in between. So that's the notion of Bayesian story merging.
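To fix the intuition, here is a toy sketch of the merging step in Python. It is not the research system: the similarity table is a made-up stand-in for the probabilistic judgment, and a real merger would accept a merge only when it makes the resulting graph a more probable explanation of the stories.

```python
# Toy story-merging sketch: stories are chains of events; events judged
# similar are mapped to a common name, and the chains collapse into one
# graph. The smaller graph is the more probable model in the sense
# discussed last time. The SIMILAR table is an assumed stand-in.
SIMILAR = {"flee": "run", "decide": "think", "stop": "chase"}

def canonical(event):
    return SIMILAR.get(event, event)

def merge_stories(stories):
    """Build one graph (event -> set of successor events) from many chains."""
    graph = {}
    for story in stories:
        events = [canonical(e) for e in story]
        for a, b in zip(events, events[1:]):
            graph.setdefault(a, set()).add(b)
    return graph

print(merge_stories([["think", "chase", "run"],
                     ["decide", "stop", "flee"]]))
# {'think': {'chase'}, 'chase': {'run'}} -- two stories, one compact graph
```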
Now I'd like to show you a little bit more convincing demonstration of that. Here's how it goes.

So there are the two stories. This is just a classroom demonstration, no big deal. But you can see there's a sort of parallel structure in there. So this is the work of a graduate student, Mark Finlayson, who processed those stories to produce those kinds of events and those kinds of events that get assembled into two story graphs. And the question is, is that the most probable way of explaining that corpus of stories? And of course, the answer is no. If you merge some things like chase and stop, then you get a simpler graph, one that is more probable in the same sense that we discussed last time. Then you can merge run and flee, because they're similar kinds of events. And finally, you've got think and decide. Boom. There is your story graph.

And this is the same idea, taken several levels higher, that produced the capacity to discover, in these two stories, the concept of revenge, as promised at the beginning of our last discussion.

So sometimes the Bayesian stuff is the right thing to do, especially if you don't know anything. But sometimes you do know stuff. And when you do know stuff, it's possible that you can do something very much more efficient. This sort of thing takes clouds of computers to process. But we learned a lot of stuff in the course of our development that we don't use a cloud of computers to figure out. We learned how to associate the gestures of our mouth with the sounds that we make, for example.

So I want to spend a minute or two talking about some work that someday will be the subject of a couple of lectures, I think. But it's the question of how to use multiple modalities, and correspondences between them, to sort out both of the contributing modalities. That sounds contradictory. Let me show you an example.

This is the example, a zebra finch. And it's showing you the result of a program written by Michael Coen, now a professor at the University of Wisconsin. So the male zebra finch learns to sing a nice mating song from its daddy. And this is what one such zebra finch sounds like.

[BIRD SONG PLAYING]

Nice, don't you think?
And here's what was learned by a program that uses no probabilistic stuff at all, but rather the notion of cross-modal coupling.

[BIRD SONG PLAYING]

Can you tell the difference? It's not known if this particular song turns on the female zebra finch. But to the untrained human ear, they sure sound a whole lot alike.

So how does that work? Here's how that works. Well, I'm not going to show you how that works. What I'm going to show you is how the classroom example works, the first chapter example in Coen's Ph.D. thesis.

Here's what happens. When we talk, we produce a Fourier transform that moves along with our speech. And if we say a vowel like aah, you get a fairly constant Fourier spectrum. And you can say, well, where are the peaks in that Fourier spectrum? And how do they correspond to the appearance of my mouth when I say the vowel? So here's how that works. So here's the Fourier spectrum of a particular vowel. And when you smooth that, those peaks are called formants. And so we're just going to keep track of the first and second formant.
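To make "first and second formant" concrete, here is a minimal sketch, assuming NumPy and SciPy: window a frame of the vowel, smooth its magnitude spectrum, and read off the first two peaks. The function and the synthetic test signal are illustrative, not Coen's code; real formant trackers usually do this with LPC analysis rather than raw spectral peaks.

```python
import numpy as np
from scipy.signal import find_peaks

def first_two_formants(frame, sample_rate, smooth_width=11):
    """Crude F1/F2 estimate: smooth the magnitude spectrum, take two peaks."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    kernel = np.ones(smooth_width) / smooth_width       # moving-average smoother
    smooth = np.convolve(spectrum, kernel, mode="same")
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    peaks, _ = find_peaks(smooth, height=0.1 * smooth.max())  # ignore ripple
    return freqs[peaks[:2]]                             # lowest two peaks: F1, F2

# A fake vowel with energy near 700 Hz and 1200 Hz, roughly aah-like.
sr = 16000
t = np.arange(2048) / sr
frame = np.sin(2 * np.pi * 700 * t) + 0.8 * np.sin(2 * np.pi * 1200 * t)
print(first_two_formants(frame, sr))   # approximately [700. 1200.]
```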
But when I say those things, I also can form an ellipse around my mouth when I say them. And when I form an ellipse around my mouth when I say them, that gives me this second modality. So the question is, is there a way of associating the gestures that produce the sound with the sound itself?

Well, there's the human data, conveniently provided by a variety of sources, including Michael Coen's wife, who produced the lip contour data on the right. So that's all marked up and color coded according to the particular vowels in English. I guess there are ten of them. So we humans all learn that. But guess what? We don't learn it from this. Because we don't get to work with any marked-up data. We learn it from that. Somehow we're exposed to the natural world, and we dig the vowel sounds out. It's fantastic how we do that.

But we do have cross-modal coupling data, and maybe that's got something to do with it. So here is a particular cluster of sounds. And what I want to know is, can I merge any two of these clusters to form a bigger cluster with a corresponding meaning? So what I can do is, I can say, well, I can watch these. I know what the lip form is when a particular sound is made. So I have these correspondences. So maybe there are four of those, like so. And maybe this same guy projects a couple of times into that. And the question is, can this guy be combined with any of these guys? And the answer is yes. If they're close together on one side, maybe that suggests you ought to cluster them on the other side.

But there's a question about what close means. So let's suppose that we also look at how these guys, these other two guys, project. And suppose this guy projects twice up here and once over here. And this guy down here just projects like crazy into that guy. Which are closer? These two or these two? Well, my diagram's getting a little crowded. But if you paid attention when I was drawing it, you would see that this guy projects in proportion to that guy. So if we take each of these projections as the components of a vector, then those two vectors are in the same direction. And the angle between them is zero. So these are the two that are closest together from the perspective of that kind of metric. And those guys are the ones that get combined.
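The test for "close" is then just a cosine. Here is a minimal numeric sketch with made-up projection counts: represent each cluster by the vector of how often it projects into each region on the other side, and merge the pair whose vectors point in the same direction.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two projection-count vectors."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Made-up counts: how often each cluster projects into each of three
# regions on the other side.
projections = {
    "A": np.array([4.0, 2.0, 0.0]),  # mostly hits the first region
    "B": np.array([2.0, 1.0, 0.0]),  # same direction as A, half the counts
    "C": np.array([0.0, 1.0, 5.0]),  # hits a different region entirely
}

# A and B project in proportion to each other: the angle between their
# vectors is zero, so they are the pair to merge, even though B projects
# less often in absolute terms.
pairs = [("A", "B"), ("A", "C"), ("B", "C")]
best = max(pairs, key=lambda p: cosine(projections[p[0]], projections[p[1]]))
print(best)  # ('A', 'B')
```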
Would you like to see a demonstration? Yeah. OK, here's a demonstration based on Coen's work.

So here we have two sides. We could think of one side as being vowel sounds and the other side as being lip contours or something. But you don't see anything in the diagram so far about how these things ought to be sorted out into groups. So if I just take one step, why, it discovers that those two guys had the same projection pattern as each other. So I take another step and do the same thing on the other side. And now, in the third step, the two areas that were formerly combined form a super area. And they're seen to project in the same way as the blue area. So using this kind of projection idea, I can gradually build up an understanding of how those regions go together. And I discover, in this contrived example, that there's a vertical arrangement on the right side that corresponds to a horizontal arrangement on the left side.

Now, you say to me, I'd like to see something a little bit more like the lip contour data. I'm just stepping through here until I get something I kind of like. Oh, that sounds good. That seems good. So there's a correspondence here. This is all made-up data, Gaussians of various shapes and orientations. Let's see what happens when I run the clustering algorithm on that. Something definite was learned at every step. We find the correspondence between the pink region on the right and the pink region on the left. In some cases, where the regions are rather blurred together, the other side is the one that helps the system figure out how things are organized.

So I cite this as an example of something I think is very important. Number one, it's possible to discover regularity without being obsessively concerned with Bayesian probability. And also, there's very likely a whole lot of this going on in human intelligence. When we emerge and begin to examine and explore the world around us, we're presented with a lot of unlabeled data that we've got to make sense of. And I believe that this kind of cross-modal coupling idea is very likely to be bound up in our understanding of that world that's presented to us. It's fast, it's direct. It doesn't take thousands of data points. It just happens. And it happens effortlessly.
And if this isn't built in-- if this isn't determined to be built in, you can come back to MIT in 15 years and put me in jail. Because I think this is really the way it works.

So there it is. That's a couple of things wrapped up. And now the next thing I want to do, for the rest of our last time together in this format, is talk to you about a variety of things. And I'll depart from my usual practice and move to some slides. So Dave, could we have the center screen, please?

So first, a brief review of where we've been and where we've come to. I think in the very first class, I talked about what artificial intelligence was. And I talked about how you could view it from either an engineering perspective or a scientific perspective. I'm on the scientific perspective side. And, nothing against applications, but I think we'll be able to make much more sophisticated and wonderful applications if we have not only the engineering perspective about building stuff but also the scientific perspective about understanding the stuff to begin with. So both perspectives are important. And in this case you can see that they all involve representations, methods, and architectures.

Dave, I've changed my mind. Could you also give me the side screen, so I can see it too?

So, that's that. What's next? The business perspective, which we talked about on Thanksgiving. The important idea being that the knee-jerk expectation that the commercial value of something is in replacing people is something that is not sensible in the first instance, and demonstrated to be unlikely and untrue in the second instance. The thing that turns people on, from the point of view of applications, is not replacing people but making new revenue, making new capability. And that at once licenses you to not have something done exclusively by a computer, but something that can be done in partnership with a person.
So all the important applications of artificial intelligence involve people and computers working in tandem, with each doing what they do best-- not with replacing people. So that's that.

Here's what AI does that makes AI different from the rest of the fields that attempt to contribute to an understanding of intelligence. Now, we have the benefit of having a language for procedures. We have all the metaphors that we are the custodians of in consequence of knowing about programming. We have the metaphor of garbage collection. We can talk about all sorts of things with programming metaphors that are unavailable to people in other fields that are interested in psychology.

We have a way to make models, because we can write programs. And when we write programs, there's no question of sweeping things under the rug. We have to work out the details. And once we've done all that, then we have opportunities to experiment that are beyond the ability to experiment in most other fields. Oh, magnificent experiments are done these days in developmental psychology and all the rest of-- all the other branches of psychology, including MRI studies that probe into your brain and see how it's consuming sugar. But it's very difficult to ablate, or take away, some piece of your knowledge and see how you work without it. I can take the line-drawing program that we talked about and say, how will it do if it doesn't know anything about fork junctions? And we can determine an answer. But I can't reach into Sebastian's head here with a surgical procedure and take out his knowledge of fork junctions. It just can't be done.

And finally, another reason why we're different is because we can put upper bounds on how much knowledge is needed in order to perform a certain kind of task. These days, with bulldozer computing, the question most often asked is, how can you get billions of the things off the web and use them?
We-- I especially-- sometimes ask the opposite question, which is, how little knowledge can you have and still understand a story? That's what's interesting to me.

So there's a methodological slide that talks to the question of how you do artificial intelligence in particular and, I suppose, science in general, engineering in general. There's a great tendency in this field to fall in love with particular methods. And we've had people who've devoted entire careers to neural nets, genetic algorithms, Bayesian probability. And that's mechanism envy. And a better way, in my judgment, is to say, what is the problem? Scientific method-- what's the problem? And then bring the right machinery to bear on the problem, rather than looking for things to do with a particular kind of machinery.

So this is the methodology that was first articulated in a forceful way by David Marr. You want to start with the competence you're trying to understand, then bring a representation to bear on it, a representation that exposes the constraints and regularities. Because without those, you can't make models. And without those models, you can't understand it, explain it, predict it, or control it. So it seems to make sense from a kind of MIT, model-centered point of view. And only when you've got all that straight do you start working on your methods, implement, and experiment, and then go around that loop.

So that's all I want to say by way of review, I suppose. I want to take a minute or two and just remind you of what's on the final. And there's nothing to remind, because you all know what's on the final already. We'll have four sections corresponding to the four exams. Then we'll have a fifth and final question that will be everything else. All that stuff you slept through will be featured there, as well as a little problem on Bayesian inference.

We rearranged the subject, mostly so I could write the demonstrations,
so the Bayesian stuff didn't come before the fourth quiz. Therefore the Bayesian stuff that you see on those previous quizzes is likely to be harder than the stuff that we'll ask on the final, because you haven't had as much experience with it as people did last year.

I've got a few icons on there to remind me to tell you a few things. As always, open everything except for computers. You can wear a costume. You can do anything you like as long as it doesn't disturb your neighbor, within reason. Well, I guess if it doesn't disturb your neighbor, it is within reason. So maybe that's all I need to say.

I'm not sure where we're going to be. But it's certainly the case that, historically, there are no visible clocks. And we soon run out of all of our cellphones, wristwatches, and other timepieces as we hand them out. So it pays to remember to bring some kind of timepiece, because we won't be able to convey the time very well. And finally, I see a little calculator there. I don't recall any exam where you actually needed a calculator. But it's sort of a security blanket to have one. People sometimes see a problem and say, oh my God, what am I going to do? I left my calculator at home. So as to avoid that anxiety, you might want to bring one even though you won't need it.

So that's the final. I'm sure there are no questions. Are there? It's obvious. Everybody will do well. Two shots, that whole thing.

Now, what to do next? Suppose this subject has turned you on. There are a variety of things that you should be thinking about doing next semester. And I wanted to review just a few of those. One of them is Marvin Minsky's subject, Society of Mind. It's very different from this class. There are no prepared lectures. Marvin doesn't rehearse. He doesn't think about what he's going to say in advance. It's like this, except it's just a conversation with Marvin.
So many people find themselves bored stiff for two lectures out of three. But then in the third lecture, Marvin will say something that you'll think about for a year or for the rest of your life. That's what happens to me. I'm bored stiff two out of three times. And then in the third lecture he says something, and I think about it for at least a year and maybe permanently. So it's an opportunity to see one of MIT's true geniuses think out loud. So it's an experience that you don't want to miss, because that's what you come here for, is to see the geniuses think out loud.

Speaking of geniuses, then there's Bob Berwick. And he, heroically, is doing two subjects in the spring, both of which I'd take if I could. One is his subject on Language Understanding. And the reason I'd take that is because I believe that language is at the center of any explanation of our intelligence. So that's the subject I would be, I suppose, most inclined to take if I were you. Well, maybe Minsky's. It's hard to say. And incidentally, very heroically, Bob is also teaching a course on how evolution works, how it really works-- insofar as we know how it really works-- as well. So both of those will be offered in the spring. I don't know how he does it. I don't know how he does two all at the same time. Of course there are lots of places where you can go to school, and the faculty will be teaching five courses at the same time. I just think they're crazy or something. I don't know how that works.

Gerry Sussman will be teaching his Large Scale Symbolic Systems subject. That's sometimes-- oh, I forgot what he wasn't able to call it, something that wasn't politically correct, about programming for people who really, really like to program. It's a splendid course on how to build really big systems.
And we use the ideas in that subject in our research system, because it's the only way-- understanding how that works is the only way that you can build systems that are too big to be built. I may say a word about that a little later.

So those are my favorite three or four picks. But there's lots of other stuff, too many things to cover-- the Media Lab, [INAUDIBLE] psychology. There's tons of stuff out there. And I would only mention that courses that have those three names on them are bound to be good. These are colleagues that I think have an important perspective-- not necessarily one I agree with, but an important perspective that you should understand-- Richards, Tenenbaum, and Sinha.

And now we come to my spring course, the Human Intelligence Enterprise. It's 6.XXX not because there's anything pornographic about it, but because for a long time, I couldn't remember its number. So I developed a habit of referring to it as 6.XXX, and it seems to have stuck. Here's what that's about.

Yeah, that might be interesting. It's taught like a humanities course, though. No lectures, I just talk. And all the TAs are veterans of that class, so if you want to know if you should do it, you have several resources. You can talk to them. Or you could look at the sorts of things we talk about. Here are some things we talk about by way of packaging.

Yeah, I can hear a little tittering there, because people have discovered the last element. Some people take the whole subject because they want to be present for that unit. And we talk about all those kinds of things. And we look to see what the common elements are in all those kinds of packaging problems, of the sort that you will face over and over again when you become an adult, no matter what you do. If you become a business person, an entrepreneur, a military officer, a scientist, or an engineer, that packaging stuff will often make the difference between whether you succeed or don't.
And then, that's the second way you can figure out whether you want to take the subject. The content, the TAs, and then of course you can always appeal to the Underground Guide. And that's why it's very rare for someone to take 6.XXX who hasn't been at this final lecture, because they read the Underground Guide. Here is an element that appeared in the Underground Guide a few years back.

There are no exams. But there is a tradition of hacking the Underground Guide. So this is another example of something that appeared. It all came about because early in the teaching of 6.XXX, I was whining to the students about the fact that I've been at MIT for a long time, since I was a freshman. And I still have yet to have any person I report to say anything about my teaching-- good, bad, indifferent. Nothing. Not a word. So the students decided that it would be interesting to see if they could say something sufficiently outrageous to force a conversation between the department chairman and me. And so far they've been totally unsuccessful. And they've tried everything. Winston shows up late, if he shows up at all. Good instructor, but constantly sipping from a brown paper bag. All kinds of stuff.

But there it is. It's a lot of fun. It's a little oversubscribed, so we have to have a lottery, and there's about a 50% chance, and so on and so forth. But many of you will find it a good thing to do.

Oh, yeah. And now I also want to remind myself that there is an IAP event that's become kind of an MIT tradition. It's the How to Speak lecture that I give. This year it'll be on January 28 in 6-120. 6-120 holds about 120 people, and about 250 show up. So if you want to go to that lecture, you should show up 15 minutes early. That's a little secret just between me and 6.034 students. It's about packaging, too. It's a one-lecture version of 6.XXX.
But it's very nonlinear, because one thing that you pick up from that one hour may make the difference between you getting the job and some other slug getting the job. So it's one of those sorts of things that can make a big difference in a short period of time. You may sleep through 50 minutes of the 55 minutes in that lecture and stay awake for that one magical five minutes when you learn something about when to tell a joke, or how to open a lecture, or how to conclude one, or a job talk, or a sales presentation, or anything. And that will make it worthwhile for you.

And then of course there's the possibility of your-- anybody who's doing UROP with me is likely to be interested in what I've recently come to call the strong story hypothesis, something that we've talked about from time to time. That's what makes us humans, and that's not what they are. They're orangutans or chimpanzees, even though the DNA that we share is-- it goes up and down. At one point it was 96%, then it went to 98%. Now I think it's back down to 97%. But whatever we are, it's not because of a huge, massive difference in DNA between us and our cousins.

So in my group we build a system, modestly called Genesis. And that's what it looks like. And it has in it all the sorts of things that we've talked about from time to time in 6.034. And it's about to move into areas that are especially interesting, like, can you detect the onset of a possible disaster before it happens and intervene? Can you retrieve precedents based on higher-level concepts? Things of that sort.

Would you like to see a demonstration? OK. So you've seen a little bit of this before. In fact, I'm not even sure what exactly I've already shown you. Let me just get over there to see what goes on here. What I'm going to do right now is I'm just going to read about Macbeth. A short precis of the plot. Not the whole thing, of course. Just a few sentences about the plot.
Right now what it's doing is absorbing the English and translating it into a sort of internal language. A sort of universal, internal language that's all about trajectories and transitions and social relationships and all that. So it's being read there by two different personas. Each of those personas has a different educational background, you might say. They might represent different cultures.

So eventually they build up graphs like that. And everything in white is stuff that's explicit in the story. And everything that's in grey is stuff that's been inferred by the system. So there are several layers of understanding. One is what's there explicitly. And the other thing that's there is stuff that's readily inferred. And because these personas have different educational backgrounds, you might say, they see the killing of Macbeth at the end of the play in a different light from one another. One sees it as an act of insane violence, and the other sees it as a consequence of revenge.

So once you've got that capability, you can do all sorts of things. For example, you can ask questions. So let me arrange it to ask-- by the way, this is a live demonstration. I'm thrilled to pieces that it actually works so far. Let's see. Why did Macbeth kill Duncan? You all know why, right?

Yeah, you're right. He didn't actually do that. On a common sense level, neither Dr. Jeckll nor Mr. Hyde has an opinion. On a reflective level, neither Dr. Jeckll nor Mr. Hyde has an opinion. That's because it didn't happen. So we call these two personas Dr. Jeckll and Mr. Hyde. And you're ready to complain right away about the spelling of Jeckll, aren't you? Well, that's because with this spelling the speech generator makes it sound a little bit more like when we say Jekyll.

But what really happened is that Macduff killed Macbeth. On a common sense level, it looks like Dr.
Jeckll thinks Macduff kills Macbeth because Macduff is insane. It looks like Mr. Hyde thinks Macduff kills Macbeth because Macbeth angers Macduff. On a reflective level, it looks like Dr. Jeckll thinks Macduff kills Macbeth as part of an act of insane violence. It looks like Mr. Hyde thinks Macduff kills Macbeth as part of acts of mistake, Pyrrhic victory, and revenge. Isn't that cool? I bet you'd get an A if you had said that in eighth grade.

But once you've got this ability to understand the story from multiple points of view, you begin to think of all kinds of wonderful things you can do. For example, you can have Dr. Jeckll negotiate with Mr. Hyde, because Dr. Jeckll will be able to understand Mr. Hyde's point of view and demonstrate to Mr. Hyde that he thinks that point of view is legitimate. Or Dr. Jeckll can teach Mr. Hyde the subject matter of a new domain. Or Dr. Jeckll can watch what's happening in Mr. Hyde's mind and avert disaster before it happens.

So let me just show you another situation here. I want to turn on the onset detector and read another little snippet. This one is about-- what should I do? We'll do the Russia and Estonia cyber war. It's reading background knowledge right now. But pretty soon, in the upper left-hand corner, as it begins to read the story, you'll see it spotting the onset of potential revenge operations or potential Pyrrhic victories, showing their foundations as they begin to emerge, and giving the system an opportunity to intervene. So there you can see all the things that it thinks might happen. Not all of them do happen. But some of them do. And you'll note, incidentally, that this is another case of Dr. Jeckll and Mr. Hyde having different cultural perspectives. One's an ally of Russia and one's an ally of Estonia. One sees it as unwarranted revenge and the other sees it as teaching a lesson.
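That onset spotting is, at bottom, pattern matching over the event graph: a concept like revenge is a small pattern over who harms whom, and a partial match is a warning. Here is a minimal sketch; the harm vocabulary and the events are made up, standing in for the real Genesis machinery:

```python
from typing import NamedTuple

class Event(NamedTuple):
    actor: str
    action: str
    target: str

HARMS = {"attack", "kill", "anger"}  # assumed harm vocabulary

def revenge_instances(events):
    """Revenge pattern: X harms Y, and later Y harms X back.
    A match on just the first event is an 'onset': flag it and watch."""
    found = []
    for i, first in enumerate(events):
        if first.action not in HARMS:
            continue
        for later in events[i + 1:]:
            if (later.action in HARMS
                    and later.actor == first.target
                    and later.target == first.actor):
                found.append((first, later))
    return found

story = [Event("Macbeth", "anger", "Macduff"),
         Event("Macduff", "kill", "Macbeth")]
print(revenge_instances(story))  # the pair of events that constitutes revenge
```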
So, I don't know. What else have we got here? Oh yeah, precedent recall. A long time ago, we talked about doing information retrieval based on vectors of keyword counts. That's cool, but not this cool. This is doing it on vectors of concepts that appear in the stories, such as revenge, even though the word revenge doesn't appear anywhere. So because we're able to understand the story on multiple levels, we can use those higher levels, which don't involve the words in the story at all, to drive the retrieval process.
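A minimal sketch of the difference, with made-up concept annotations: it is the same cosine-similarity retrieval you would do over keyword counts, only the vectors now count concepts the understander found, not words on the page.

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse count vectors (dicts)."""
    dot = sum(u[k] * v[k] for k in u if k in v)
    norm = lambda w: math.sqrt(sum(x * x for x in w.values()))
    return dot / (norm(u) * norm(v))

# Concept counts a story understander might extract (made up here);
# note that the word "revenge" need not appear in any story's text.
library = {
    "Macbeth":   {"revenge": 2, "Pyrrhic victory": 1, "regicide": 1},
    "Hamlet":    {"revenge": 3, "madness": 2},
    "cyber war": {"revenge": 1, "teaching a lesson": 1},
}
query = {"revenge": 1, "Pyrrhic victory": 1}

best = max(library, key=lambda name: cosine(query, library[name]))
print(best)  # Macbeth -- the closest precedent in concept space
```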
So all that is a consequence of a variety of things, one of which is the specialists that translate the English into an internal language. And it's also, incidentally-- I mentioned it a little before-- it's also a consequence of our use of Gerry Sussman's propagator architecture. So a student comes into our group and says he wants to do something. And we say, OK, here's how the system is organized. It's like a bunch of boxes that are wired together. So you get a box, and we'll tell you what the inputs are going to look like. And we'll tell you what we want on the outputs. And if you don't like the inputs, just ignore them. And if we don't like your outputs, we'll just ignore those. So nobody can screw up anything, because they have a very circumscribed piece of the system to work with. So I can say, for example, the president wanted Iraq to move toward democracy. And bingo. That starts a propagation through that network.

All this would be unconvincing, in my view, if it weren't eventually connected to perception. Because if it's not eventually connected with perception, it's yet another system that demonstrates how smart it can seem to be without actually knowing anything. So another half of what we do is an early-stage attempt to connect the language stuff with things that are going on in the world. So we say, imagine a jumping action. And there is a jumping action that's part of a test sweep developed by the Defense Advanced Research Projects Agency to drive what they call the Mind's Eye program, which was developed largely as a consequence of work done here at MIT, focused on the idea that if we're going to understand the nature of intelligence, we have to understand how language is coupled into our perceptual systems and how those perceptual systems can answer questions posed to them by the language system.

That's a little demo of the Genesis system. Here are the issues that we're trying to explore. Nothing too serious, just the nature of what is extraordinarily fundamental to any explanation of human thinking.

Now, all of this might turn you on. And you say to me, well, you're sick and tired of MIT. You'd like to go somewhere else for graduate school. So now that I've demonstrated what we do here, one of the many things we do here, I'll talk a little bit about other places you can go. This is a sort of MIT-centric view of the world. It represents all of the places you could go when I was a kid.

But while I've got this particular diagram on here, I just-- sort of testing my MIT arrogance-- I remember a story often told by my colleague Peter Szolovits. He says that when he came to a job interview from Caltech to MIT, he was sitting here for three days and nobody spoke to him. So eventually he said, I've got to do something. He walked up to a graduate student and said, hi, my name is Peter Szolovits. I'm from Caltech. And the graduate student said, Caltech sucks, and walked away.

Anyway, we've populated all these places now that you see here, and more. This is just a list that I scratched up this morning. I'm sure I've forgotten many that have equal right to be on this list. But in the end, which one you go to depends on who you want to apprentice yourself to. Because a graduate school is an apprenticeship.
728 00:37:06,730 --> 00:37:09,802 All this would be unconvincing, in my view, if 729 00:37:09,802 --> 00:37:11,810 it weren't eventually connected to perception. 730 00:37:11,810 --> 00:37:14,270 Because if it's not eventually connected with perception, 731 00:37:14,270 --> 00:37:17,580 it's yet another system that demonstrates how smart it can 732 00:37:17,580 --> 00:37:20,420 seem to be without actually knowing anything. 733 00:37:20,420 --> 00:37:25,780 So the other half of what we do is an early-stage attempt to 734 00:37:25,780 --> 00:37:28,410 connect the language stuff with things that are going on 735 00:37:28,410 --> 00:37:30,720 in the world. 736 00:37:30,720 --> 00:37:32,950 So we say, imagine a jumping action. 737 00:37:32,950 --> 00:37:36,640 And there is a jumping action that's part of a test suite 738 00:37:36,640 --> 00:37:39,240 developed by the Defense Advanced Research Projects Agency to 739 00:37:39,240 --> 00:37:43,120 drive what they call the Mind's Eye program, which was 740 00:37:43,120 --> 00:37:45,090 developed largely as a consequence of work done here 741 00:37:45,090 --> 00:37:47,770 at MIT, focused on the idea that if we're going to 742 00:37:47,770 --> 00:37:50,370 understand the nature of intelligence, we have to 743 00:37:50,370 --> 00:37:54,090 understand how language is coupled into our perceptual 744 00:37:54,090 --> 00:37:56,580 systems and how those perceptual systems can answer 745 00:37:56,580 --> 00:37:58,830 questions posed to them by the language system. 746 00:38:06,000 --> 00:38:07,710 That's a little demo of the Genesis system. 747 00:38:07,710 --> 00:38:09,740 Here are the issues that we're trying to explore. 748 00:38:09,740 --> 00:38:12,530 Nothing too serious, just the nature of what is 749 00:38:12,530 --> 00:38:14,270 extraordinarily fundamental to any 750 00:38:14,270 --> 00:38:15,650 explanation of human thinking. 751 00:38:20,860 --> 00:38:23,050 Now, all of this might turn you on. 752 00:38:23,050 --> 00:38:26,370 Or you say to me, well, I'm sick and tired of MIT. 753 00:38:26,370 --> 00:38:28,290 You'd like to go somewhere else for graduate school. 754 00:38:28,290 --> 00:38:31,900 So now that I've demonstrated what we do here, one of the 755 00:38:31,900 --> 00:38:34,360 many things we do here, I'll talk a little bit about other 756 00:38:34,360 --> 00:38:35,200 places you can go. 757 00:38:35,200 --> 00:38:38,230 This is a sort of MIT-centric view of the world. 758 00:38:41,450 --> 00:38:43,250 It represents all of the places you could go 759 00:38:43,250 --> 00:38:44,500 when I was a kid. 760 00:38:47,310 --> 00:38:48,490 But while I've got this particular 761 00:38:48,490 --> 00:38:49,990 diagram up here-- 762 00:38:49,990 --> 00:38:51,670 just sort of testing my MIT arrogance-- 763 00:38:51,670 --> 00:38:54,330 I remember a story often told by my 764 00:38:54,330 --> 00:38:56,025 colleague Peter Szolovits. 765 00:38:56,025 --> 00:38:59,770 He says that when he came to a job interview from Caltech to 766 00:38:59,770 --> 00:39:02,330 MIT, he sat here for three days and 767 00:39:02,330 --> 00:39:03,210 nobody spoke to him. 768 00:39:03,210 --> 00:39:06,740 So eventually he said, I've got to do something. 769 00:39:06,740 --> 00:39:09,960 He walked up to a graduate student and said, hi, my name 770 00:39:09,960 --> 00:39:11,375 is Peter Szolovits. 771 00:39:11,375 --> 00:39:13,370 I'm from Caltech. 772 00:39:13,370 --> 00:39:14,940 And the graduate student said, Caltech 773 00:39:14,940 --> 00:39:16,190 sucks, and walked away. 774 00:39:19,780 --> 00:39:23,740 Anyway, we've populated all these places now that you see 775 00:39:23,740 --> 00:39:25,040 here, and more. 776 00:39:25,040 --> 00:39:26,910 This is just a list that I scratched up this morning. 777 00:39:26,910 --> 00:39:32,490 I'm sure I've forgotten many that have equal right 778 00:39:32,490 --> 00:39:34,810 to be on this list. 779 00:39:34,810 --> 00:39:38,190 But in the end, which one you go to depends on who you want 780 00:39:38,190 --> 00:39:40,270 to apprentice yourself to. 781 00:39:40,270 --> 00:39:43,100 Because a graduate school is an apprenticeship. 782 00:39:43,100 --> 00:39:45,560 And that means if you go to a place with just one person, 783 00:39:45,560 --> 00:39:47,990 it's OK if that's the person you want to apprentice 784 00:39:47,990 --> 00:39:49,210 yourself to. 785 00:39:49,210 --> 00:39:51,450 Each of these places has a different focus because they 786 00:39:51,450 --> 00:39:53,790 have different people. 787 00:39:53,790 --> 00:39:56,260 So you need to find out if there's somebody at any of 788 00:39:56,260 --> 00:39:56,900 these places. 789 00:39:56,900 --> 00:40:00,340 It doesn't matter if it's AI or some other field. 790 00:40:00,340 --> 00:40:01,930 Theoretical physics-- you've got to find out if there's 791 00:40:01,930 --> 00:40:03,570 somebody at that place you want to 792 00:40:03,570 --> 00:40:04,480 apprentice yourself to. 793 00:40:04,480 --> 00:40:07,380 So those site visits are really important. 794 00:40:07,380 --> 00:40:09,110 And I would also like to stress that when you make your 795 00:40:09,110 --> 00:40:13,310 application to graduate school, it's very different 796 00:40:13,310 --> 00:40:16,030 from applying to undergraduate school. 797 00:40:16,030 --> 00:40:18,870 Because they don't care whether their school is good 798 00:40:18,870 --> 00:40:21,650 for you at all. 799 00:40:21,650 --> 00:40:23,480 They only care about one thing-- 800 00:40:23,480 --> 00:40:26,450 whether you're good for their school. 801 00:40:26,450 --> 00:40:28,190 So don't get confused and talk about how it's a 802 00:40:28,190 --> 00:40:29,060 wonderful fit for you. 803 00:40:29,060 --> 00:40:30,360 Because what they're interested in is whether 804 00:40:30,360 --> 00:40:32,520 you're going to contribute to their research program. 805 00:40:35,860 --> 00:40:37,490 Oh, I should say that if you're applying in artificial 806 00:40:37,490 --> 00:40:40,870 intelligence, that means you don't say, I'm interested in 807 00:40:40,870 --> 00:40:42,900 all aspects of thinking. 808 00:40:42,900 --> 00:40:46,240 You need to be focused. 809 00:40:46,240 --> 00:40:47,630 There's another reason why you don't say that you're 810 00:40:47,630 --> 00:40:50,180 interested in all aspects of thinking, and that is the 811 00:40:50,180 --> 00:40:53,820 defect theory of AI career selection. 812 00:40:53,820 --> 00:40:57,800 It seems to be the case, strange though it may seem, 813 00:40:57,800 --> 00:41:00,960 that people in artificial intelligence often specialize 814 00:41:00,960 --> 00:41:03,050 their research on the things that they don't do very well 815 00:41:03,050 --> 00:41:04,880 themselves. 816 00:41:04,880 --> 00:41:07,020 So people who study language, with the exception of Bob 817 00:41:07,020 --> 00:41:09,790 Berwick, often have trouble getting 818 00:41:09,790 --> 00:41:12,180 out a coherent sentence. 819 00:41:12,180 --> 00:41:16,140 And people who do hand-eye coordination are the sorts who 820 00:41:16,140 --> 00:41:18,540 spill their coffee. 821 00:41:18,540 --> 00:41:24,530 So don't say you want to study all thinking because-- 822 00:41:24,530 --> 00:41:28,390 The most extreme case of this, though, is-- 823 00:41:28,390 --> 00:41:30,370 if you don't mind, I'll tell you a story about an extreme 824 00:41:30,370 --> 00:41:31,740 case of this. 825 00:41:31,740 --> 00:41:34,360 We had a visitor from Japan in the old artificial 826 00:41:34,360 --> 00:41:35,920 intelligence lab many years ago. 827 00:41:35,920 --> 00:41:37,000 He came for a year. 828 00:41:37,000 --> 00:41:42,080 Let's call him Yoshiaki, just to pick a name.
829 00:41:42,080 --> 00:41:44,760 Yoshiaki spent a year at the artificial intelligence lab, 830 00:41:44,760 --> 00:41:48,300 and he left his wife in Japan. 831 00:41:48,300 --> 00:41:50,752 And the reason was, she was pregnant. 832 00:41:50,752 --> 00:41:54,360 And at that time, you could not get a visa to the United 833 00:41:54,360 --> 00:41:57,240 States unless you had a smallpox vaccination. 834 00:41:57,240 --> 00:41:58,990 And because she was pregnant, she didn't want to get a 835 00:41:58,990 --> 00:42:03,650 smallpox vaccination, because there's a small danger to the 836 00:42:03,650 --> 00:42:05,485 fetus if you get a smallpox vaccination 837 00:42:05,485 --> 00:42:07,000 while you're pregnant. 838 00:42:07,000 --> 00:42:09,060 So she stayed back there. 839 00:42:09,060 --> 00:42:11,580 So Yoshiaki, let us call him-- 840 00:42:11,580 --> 00:42:12,960 it was a day before he was to get on the 841 00:42:12,960 --> 00:42:13,950 airplane to go home. 842 00:42:13,950 --> 00:42:17,190 I walked into his office and his desk was covered with 843 00:42:17,190 --> 00:42:19,080 pictures of his wife. 844 00:42:19,080 --> 00:42:21,630 By the way, Yoshiaki, I should tell you, is a computer vision 845 00:42:21,630 --> 00:42:24,405 guy, interested in object recognition. 846 00:42:27,070 --> 00:42:29,030 So you might suspect he has some problem. 847 00:42:29,030 --> 00:42:30,140 So he's looking at these pictures. 848 00:42:30,140 --> 00:42:31,880 I thought, oh my God, this is a tender moment. 849 00:42:31,880 --> 00:42:35,140 He's anticipating his return to Japan and 850 00:42:35,140 --> 00:42:37,140 reunion with his wife. 851 00:42:37,140 --> 00:42:39,620 So I muttered something to that effect. 852 00:42:39,620 --> 00:42:42,530 And then he looked at me like I was the king of the fools. 853 00:42:42,530 --> 00:42:45,470 And he said, it's not a question of tenderness. 854 00:42:45,470 --> 00:42:49,880 I'm afraid I won't recognize her at the Tokyo airport. 855 00:42:49,880 --> 00:42:52,220 So I said, Yoshiaki, 856 00:42:52,220 --> 00:42:52,720 how can this be? 857 00:42:52,720 --> 00:42:54,260 You study computer vision. 858 00:42:54,260 --> 00:42:55,930 You study object recognition. 859 00:42:55,930 --> 00:42:58,400 This is your wife. 860 00:42:58,400 --> 00:43:00,395 How can you think you wouldn't recognize her 861 00:43:00,395 --> 00:43:03,220 at the Tokyo airport? 862 00:43:03,220 --> 00:43:04,370 And then he looks at me, and-- 863 00:43:04,370 --> 00:43:06,840 God is my witness-- he says, they all look alike. 864 00:43:16,760 --> 00:43:18,860 Well, now, as we come close to the end, 865 00:43:18,860 --> 00:43:19,680 what are the big questions? 866 00:43:19,680 --> 00:43:20,130 Is it useful? 867 00:43:20,130 --> 00:43:21,310 Of course it's useful. 868 00:43:21,310 --> 00:43:23,630 It's part of the toolkit, now, of everybody who claims to be 869 00:43:23,630 --> 00:43:25,310 a computer scientist. 870 00:43:25,310 --> 00:43:26,730 What are the powerful ideas in these things? 871 00:43:26,730 --> 00:43:30,290 Well, the most powerful idea is the 872 00:43:30,290 --> 00:43:33,200 idea of the powerful idea. 873 00:43:33,200 --> 00:43:34,450 And here are a few of my favorites. 874 00:43:37,900 --> 00:43:38,930 No surprises there. 875 00:43:38,930 --> 00:43:40,180 That's just Winston's picks. 876 00:43:43,030 --> 00:43:45,340 But there's one more I would like to add. 877 00:43:45,340 --> 00:43:48,100 And that is that all great ideas are simple.
878 00:43:48,100 --> 00:43:51,770 A lot of times, we at MIT confuse value with complexity. 879 00:43:51,770 --> 00:43:53,660 And many of the things that were the simplest in this 880 00:43:53,660 --> 00:43:56,860 subject are actually the most powerful. 881 00:43:56,860 --> 00:44:05,160 So be careful about confusing simplicity with triviality and 882 00:44:05,160 --> 00:44:07,000 thinking that something can't be important unless it's 883 00:44:07,000 --> 00:44:09,580 complicated and deeply mathematical. 884 00:44:09,580 --> 00:44:13,270 It's usually the intuition that's powerful, and the 885 00:44:13,270 --> 00:44:15,380 mathematics is the [INAUDIBLE] element. 886 00:44:18,080 --> 00:44:22,090 Sometimes people argue that real artificial intelligence is impossible. 887 00:44:22,090 --> 00:44:24,700 One of the most common arguments is, well, what if we 888 00:44:24,700 --> 00:44:26,170 had a room? 889 00:44:26,170 --> 00:44:28,630 And you're in the room and you're asked to translate some 890 00:44:28,630 --> 00:44:29,890 Chinese documents. 891 00:44:29,890 --> 00:44:31,140 You've got a bunch of books. 892 00:44:33,490 --> 00:44:34,920 And in the end, you could do the translation. 893 00:44:34,920 --> 00:44:37,750 But you cannot be said to understand Chinese. 894 00:44:37,750 --> 00:44:43,260 This is the argument of the Berkeley 895 00:44:43,260 --> 00:44:44,510 philosopher John Searle. 896 00:44:46,960 --> 00:44:49,200 So the trouble is-- 897 00:44:49,200 --> 00:44:52,950 well, the argument is that the books aren't intelligent. 898 00:44:52,950 --> 00:44:54,900 They're just ink on a page. 899 00:44:54,900 --> 00:44:58,300 And the person is just a computer, just a processor. 900 00:44:58,300 --> 00:45:01,360 It doesn't actually know anything. 901 00:45:01,360 --> 00:45:03,660 So since the understanding can't be in either the person or the 902 00:45:03,660 --> 00:45:05,436 books, it can't exist at all. 903 00:45:05,436 --> 00:45:10,410 But that just forgets that there's a magic that comes 904 00:45:10,410 --> 00:45:14,940 about when a running program, when a process, executes in 905 00:45:14,940 --> 00:45:18,980 time over knowledge that it continually contributes to. 906 00:45:18,980 --> 00:45:22,680 So the reductionist arguments are among the many that have 907 00:45:22,680 --> 00:45:26,630 been ineffectually posed to argue that artificial 908 00:45:26,630 --> 00:45:27,880 intelligence is impossible. 909 00:45:30,170 --> 00:45:32,990 But that bears longer discussion. 910 00:45:32,990 --> 00:45:35,980 Let me just bring up the biggest issue in my mind, 911 00:45:35,980 --> 00:45:40,890 which is, it's not a question of whether we humans 912 00:45:40,890 --> 00:45:45,260 are too smart to have our intelligence duplicated or 913 00:45:45,260 --> 00:45:46,180 excelled by a computer. 914 00:45:46,180 --> 00:45:49,780 It's a question of whether we're smart enough to pull it off. 915 00:45:49,780 --> 00:45:54,096 I once had a pet raccoon. 916 00:45:54,096 --> 00:45:56,110 Now, it's illegal to have a pet raccoon. 917 00:45:56,110 --> 00:45:58,710 But this one was an orphan. 918 00:45:58,710 --> 00:46:01,510 Its mother had been hit by a car or something. 919 00:46:01,510 --> 00:46:03,772 A friend of mine brought the raccoon to me, knowing I kind 920 00:46:03,772 --> 00:46:05,900 of like animals. 921 00:46:05,900 --> 00:46:08,600 And I have to say, I kept the raccoon for a year. 922 00:46:08,600 --> 00:46:13,250 At that point, she wanted to go out and be on her own. 923 00:46:13,250 --> 00:46:14,840 So I had this raccoon.
924 00:46:14,840 --> 00:46:17,760 And this raccoon was smarter than any dog I've ever had. 925 00:46:17,760 --> 00:46:19,620 Within a day, she learned how to pry the 926 00:46:19,620 --> 00:46:21,670 refrigerator door open. 927 00:46:21,670 --> 00:46:24,150 So I spent that whole year taping the refrigerator door 928 00:46:24,150 --> 00:46:26,810 shut every time. 929 00:46:26,810 --> 00:46:28,460 And then we'd play jokes on each other. 930 00:46:31,520 --> 00:46:33,440 She wouldn't eat hot dogs. 931 00:46:33,440 --> 00:46:35,030 And I wanted her to eat hot dogs desperately because 932 00:46:35,030 --> 00:46:37,880 they're cheap and easy to serve. 933 00:46:37,880 --> 00:46:41,300 All she would eat was cooked chicken wings, 934 00:46:41,300 --> 00:46:42,020 wouldn't eat hot dogs. 935 00:46:42,020 --> 00:46:44,755 So one day I said, well, I'm going to play a trick on her. 936 00:46:44,755 --> 00:46:45,950 I took a chicken bone. 937 00:46:45,950 --> 00:46:47,790 I stuck it in the middle of a hot dog and put it in a 938 00:46:47,790 --> 00:46:48,770 garbage can. 939 00:46:48,770 --> 00:46:49,800 And she went for it. 940 00:46:49,800 --> 00:46:51,705 Her genes took over and she went for it. 941 00:46:51,705 --> 00:46:55,390 And she was happy with hot dogs ever after. 942 00:46:55,390 --> 00:46:56,790 She wouldn't let me read. 943 00:46:56,790 --> 00:47:03,050 She would crawl up underneath the book and 944 00:47:03,050 --> 00:47:05,140 interfere and make me-- 945 00:47:05,140 --> 00:47:07,480 she always wanted to suck on my thumb, which turned blue 946 00:47:07,480 --> 00:47:08,070 eventually. 947 00:47:08,070 --> 00:47:12,320 You'd be amazed at how much a raccoon can suck. 948 00:47:12,320 --> 00:47:14,610 It's just extremely powerful. 949 00:47:14,610 --> 00:47:18,540 The best parts were when she would go bike riding with me. 950 00:47:18,540 --> 00:47:20,040 I put on a heavy sweater because they have 951 00:47:20,040 --> 00:47:20,900 a pretty good grip. 952 00:47:20,900 --> 00:47:24,710 I'd put on a heavy sweater and she'd kind of mount herself on 953 00:47:24,710 --> 00:47:27,440 my back and look out over my shoulder, and she stopped traffic for 954 00:47:27,440 --> 00:47:28,690 miles around. 955 00:47:31,460 --> 00:47:32,460 So she was really smart. 956 00:47:32,460 --> 00:47:36,960 But the interesting thing is that at no point did I ever 957 00:47:36,960 --> 00:47:39,460 presume that that raccoon was smart enough to build a 958 00:47:39,460 --> 00:47:42,420 machine that was as smart as a raccoon. 959 00:47:42,420 --> 00:47:44,790 So when we think that we can, it involves a certain element 960 00:47:44,790 --> 00:47:48,320 of hubris that may or may not be justified. 961 00:47:51,000 --> 00:47:52,730 Well, there it is. 962 00:47:52,730 --> 00:47:54,880 Just a couple more things to do. 963 00:47:54,880 --> 00:48:03,005 One of which is, you should understand that Kendra and 964 00:48:03,005 --> 00:48:09,610 Kenny and Yuan and Martin and Gleb are doing a lot of stuff 965 00:48:09,610 --> 00:48:11,846 that's outside their job description. 966 00:48:11,846 --> 00:48:17,290 All of these quiz review deals that they've arranged are not 967 00:48:17,290 --> 00:48:18,300 in their job description. 968 00:48:18,300 --> 00:48:19,800 I didn't ask them to do it. 969 00:48:19,800 --> 00:48:22,710 That's all just plain old professionalism.
970 00:48:22,710 --> 00:48:24,120 So they've been wonderful to work with and 971 00:48:24,120 --> 00:48:25,085 I'd just like to-- 972 00:48:25,085 --> 00:48:36,470 [APPLAUSE] 973 00:48:36,470 --> 00:48:38,710 --offer them a round of applause. 974 00:48:38,710 --> 00:48:44,180 And of course, Bob and Randy and Mark have done fabulous 975 00:48:44,180 --> 00:48:45,810 stuff as well. 976 00:48:45,810 --> 00:48:54,520 And we of the staff, the TAs, Mark, Bob, and Randy, have 977 00:48:54,520 --> 00:49:01,360 nothing else to do except wish you good hunting on the final 978 00:49:01,360 --> 00:49:05,350 and a good long winter solstice hibernation period 979 00:49:05,350 --> 00:49:07,270 after that. 980 00:49:07,270 --> 00:49:10,420 And that is the end of the story. 981 00:49:10,420 --> 00:49:12,620 And we hope you live happily ever after.