PROFESSOR PATRICK WINSTON: Ladies and gentlemen, the engineers' drinking song. Back in the day, I drank quite a lot to that song. And as drinking songs go, it's not bad. I caution you, however: before playing this song in the presence of small children, audition it first. Some of the verses are sufficiently gross as to make a sailor go beyond blushing.

It's an interesting song because there are an infinite number of verses. Here's the mathematical proof. Suppose there were a finite number of verses. Then there would be a last verse. And if there were a last verse, then some drunk alumnus would compose a new one. Therefore, there is no last verse, the size is not finite, and there are an infinite number of verses.

I play it for you today because I'm an engineer. I like to build stuff. I build stuff out of wood. I build stuff out of metal. I build stuff out of rocks. And I especially like to write programs. I don't know, sometimes people come to me and say, I'm majoring in computer science, but I don't like to write programs. I've always been mystified by that. I mean, if you want to show how tough you are, you can go bungee jumping or drive a nail through your hand or something like that instead.

But I've written quite a few programs for demonstrating stuff in this subject. They're all written in Java, principally because I can therefore make them available to you and to the rest of the world by way of Web Start. A few weeks ago, I was mucking around with the system and broke the version on the server, and within 15 minutes I got an email from somebody in the depths of Anatolia complaining about it and asking me to bring it back up.

This particular program is patterned after an early AI classic. It was the business end of a program written by Terry Winograd, who became, and is, a professor of computer science at Stanford University-- which is on the west coast, for those of you who don't know-- on the strength of his work on the natural language front end of this program. But the natural language part is not what makes it of interest for us today. It's more the other kinds of stuff.

Let's pile these things up. Now, I'm going to ask it to do something, maybe put B2 on top of B7. Not bad. How about B6 on B3? This program's kind of clever. Let me do one more. Let's put B7 on B2. OK, now let's see. Maybe B5 on B2? B4 on B3 first, maybe? Oh, I must have clicked the wrong button. Oh, there it goes. OK. Let's put B4 on B1. Agh, my mouse keeps getting out of control. Now, let's put B1 on B2. This is an example I'm actually going to work out on the board. Oh, I see. My touch pad accidentally got activated. B1 on B2. Now, let's ask a question. OK. Well.

SPEAKER 2: [SINGING]

PROFESSOR PATRICK WINSTON: Stop.

SPEAKER 3: [LAUGHTER]

PROFESSOR PATRICK WINSTON: Had enough of that. Let's see. Why did you put-- why did you want to get rid of B4? OK, one--

SPEAKER 2: [SINGING]

PROFESSOR PATRICK WINSTON: Maybe they think... That's what happens when you use software you write yourself. Why did you want to clear the top of B2? Did I do that? Why did you clear the top of B1? OK.

SPEAKER 2: [SINGING]

SPEAKER 3: [LAUGHTER]

PROFESSOR PATRICK WINSTON: Oh, it's haunting me. Yeah. So the drinking song is easily offended, I guess. But I won't develop that scenario again. What I want to show you is that this program looks like it's kind of smart, and it somehow can answer questions about its own behavior. Have you ever written a program that answered questions about its own behavior? Probably not. Would you like to learn how to do that? OK.

So by the end of the hour, you'll be able to write this program and many more like it that know how to answer questions about their own behavior. There have been tens of thousands of such programs written, but only by people who know the stuff I'm going to tell you about right now, OK? So what I want to do is start by taking this program apart on the board and talking to you about the modules, the subroutines, that it contains. So here it is.

The first thing we need to think about: here are some blocks. What has to happen if I'm going to put the bottom block on the larger block? Well, first of all, I have to find space for it. Then I have to grasp the lower block. And I have to move it, and I have to ungrasp it. So those are four things I need to do in order to achieve what I want to do. So therefore, I know that the put-on method has four pieces. It has to find space on the target block. It has to grasp the block that it's been commanded to move. Then it has to move, and then it has to ungrasp.

But taking hints from some of the questions that it did answer before I got haunted by the music, taking our cue from that, we know that in order to grasp something, in this particular world, you can't have anything on top of it. So grasp, therefore, may call clear top in order to get stuff off of the target object. And that may happen in an iterative loop, because there may be several things on top. And how do you get rid of stuff? Well, by calling get rid of. And that may go around a loop several times. And then, the way you get rid of stuff is by calling put-on. So that gives us recursion, and it's from the recursion that we get a lot of the apparent complexity of the program's behavior when it solves a problem. Now, in order to find space, you also have to call get rid of. So that's where I meant to put this other iterative loop, not down here. Clear top has got the iterative loop inside of it. So that's the structure of the program. It's extremely simple.
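Here is a minimal sketch of the structure just described: goal-centered programming in which every subroutine wraps itself around a goal and records it, so the finished trace is the tree that the question answering worked out on the board below will walk. It is written in Python rather than the Java of the actual demonstration, and everything in it (the Goal and BlocksWorld classes, the why and how helpers) is an illustrative invention, not Winograd's or Winston's code.

```python
# A toy, goal-centered blocks world: every subroutine records a goal node,
# so the finished trace is an and-tree.  The parent link answers "why?",
# the child list answers "how?".
class Goal:
    def __init__(self, description, parent=None):
        self.description = description
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)


class BlocksWorld:
    def __init__(self, on_top_of):
        self.on_top_of = dict(on_top_of)          # block -> what it sits on

    def blocks_on(self, support):
        return [b for b, s in self.on_top_of.items() if s == support]

    def put_on(self, block, target, parent=None):
        goal = Goal(f"put {block} on {target}", parent)
        self.find_space(target, goal)
        self.grasp(block, goal)
        Goal(f"move {block}", goal)
        Goal(f"ungrasp {block}", goal)
        self.on_top_of[block] = target
        return goal

    def find_space(self, target, parent):
        goal = Goal(f"find space on {target}", parent)
        if target == "table":                     # the table always has room
            return
        for other in self.blocks_on(target):      # may loop several times
            self.get_rid_of(other, goal)

    def grasp(self, block, parent):
        goal = Goal(f"grasp {block}", parent)
        self.clear_top(block, goal)

    def clear_top(self, block, parent):
        goal = Goal(f"clear top of {block}", parent)
        for other in self.blocks_on(block):       # the iterative loop lives here
            self.get_rid_of(other, goal)

    def get_rid_of(self, block, parent):
        goal = Goal(f"get rid of {block}", parent)
        self.put_on(block, "table", goal)         # recursion back into put_on


def why(goal):
    return goal.parent.description if goal.parent else "because you told me to"


def how(goal):
    return [child.description for child in goal.children]


# The example worked on the board: BX sits on B1, and we ask to put B1 on B2.
world = BlocksWorld({"B1": "table", "B2": "table", "BX": "B1", "BY": "table"})
top = world.put_on("B1", "B2")
clear = top.children[1].children[0]               # grasp B1 -> clear top of B1
print(why(clear))                                 # grasp B1
print(how(clear))                                 # ['get rid of BX']
print(why(top))                                   # because you told me to
```

Asking why walks one level up the parent link; asking how reads off the children, exactly as described in what follows.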
PROFESSOR PATRICK WINSTON: And you might say to me, well, how can you get such apparently complex-looking behavior out of such a simple program? A legitimate question. But before we tackle that one head on, I'd like to do a simulation of this program with a very simple blocks problem. It's the one I almost showed you, and it goes like this. Here's B1. We'll call this one BX, because I forgot its name. Here's BY. And here's B2. And the task is to put B1 on B2.

According to our system diagram, that results in four calls to subroutines. We have to find space. We have to grasp B1. We have to move, and then we ungrasp. Now, the way we grasp something is that the first thing we have to do is clear off its top. So grasp calls clear top. And clear top in turn calls get rid of. And let me see, let me keep track of these. This is clearing the top of B1, and this is getting rid of BX. And the way we get rid of BX is by putting BX on the table. And that in turn causes calls to another find space, another grasp, another move, and another ungrasp. So that's a little trace of the program as it works on this simple problem.

So how does it go about answering the questions that I demonstrated to you a moment ago? Let's do that by using this trace. How, for example, does it answer the question, why did you get rid of BX? [INAUDIBLE], what do you think? How can it answer that question?

SPEAKER 4: [INAUDIBLE]

PROFESSOR PATRICK WINSTON: So it goes up one level and reports what it sees. So it says, and said in the demonstration, I got rid of BX because I was trying to clear the top of B1. So if I were to say, why did you clear the top of B1, it would say, because I was trying to grasp it. If I were to say, why did you grasp B1, it would say, because I was putting B1 on B2. If I say, why did you put B1 on B2, it would say, slavishly, because you told me to.

OK, so that's how it deals with why questions. How about how questions? Timothy, what do you think about that one? How would it go about answering a question about how you did something? Do you have a thought?

TIMOTHY: Um, yeah, it would think about what I was trying to accomplish.

PROFESSOR PATRICK WINSTON: Yeah, but how would it do that? How would the program do that? We know that answering a why question makes it go up one level. How does it answer a how question? Sebastian?

SEBASTIAN: It goes down one level.

PROFESSOR PATRICK WINSTON: You go down one level. So you start off all the way up here with a put-on. It will say, oh, well, I did these four things. You say, how did you grasp B1? It will say, by clearing its top. How did you clear its top? By getting rid of BX. How did you get rid of it? By putting it on the table. So that's how it answers how questions, by going down in this tree, this trace of the program in action, so as to see how things are put together.

What are these things that are being put together? What's the word I've been avoiding so as to bring this to a crescendo? What are these objectives, these things it wants to do? They're goals. So this thing is leaving a trace, which is a goal tree. Does that sound familiar? Three days ago, we talked about goal trees in connection with integration. So this thing is building a goal tree, also known as an and-or tree. So this must be an and tree. And if this is an and tree, are there any and nodes? Sure, there's one right there. So do you think, then, that you can answer questions about your own behavior as long as you build an and-or tree? Sure. Does this mean that the integration program could answer questions about its own behavior? Sure. Because they both build goal trees, and wherever you've got a goal tree, you can answer certain kinds of questions about your own behavior.

So let me see if, in fact, it really does build itself a goal tree as it solves problems. So this time, we'll put B6 on B3. But watch it develop its goal tree. In contrast to the simple example I was working on the board, this gets to be a pretty complicated goal tree. But I can still ask questions about its behavior. For example, I could say, why did you put B6 on B3? Because you told me to. All right, so the complexity of the behavior is largely a consequence not of the complexity of the program, in this particular case, but of the building of this giant goal tree as a consequence of the complexity of the problem.

This brings us back to one of our previous matters, and early on to one of the gold star ideas of today. This gold star idea goes back to a lecture given in the late '60s by Herb Simon, who was the first Nobel Laureate in the pseudo Nobel Prize for economics. Is that right, Bob? Was he the first? All right, he was the first winner of the Nobel Prize, the pseudo Nobel Prize, in economics. And in this lecture, which was titled "The Sciences of the Artificial," he said, imagine that you're looking on a beach at the path of an ant. And he said, well, you know, the path of the ant looks extremely complicated. And you're tempted to think the ant is some kind of genius or monster-brain ant. But in fact, when you take a closer look, what you discover is that there are a bunch of pebbles on the beach, and all the ant is doing is avoiding those pebbles on its way home. So the complexity of the behavior, said Simon, is a consequence of the complexity of the environment, not the complexity of the program. So that's the metaphorical Simon's ant. And what it says is that the complexity of the behavior is the max of the complexity of the program and the complexity of the environment. So that's something we'll see many times during the rest of the semester.
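Stated as a formula, the Simon's ant point on the board is:

    complexity(behavior) = max(complexity(program), complexity(environment))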
PROFESSOR PATRICK WINSTON: Complex behavior, simple program. You think it's going to be complicated, and it turns out to be simple, because the problem has the complexity, not the program.

So that brings us to check box three in today's talk, and there's a little bit of a seam here, because now I want to stop talking about goal-centered programming and start talking about rule-based expert systems. Rule-based expert systems were developed in a burst of enthusiasm about the prospects for commercial applications of artificial intelligence in the mid-1980s. At that time it was supposed, and lengthy articles were written about it, that you could account for useful aspects of human intelligence by writing all the knowledge in the form of simple rules. So: if this is true, then that's true. If you want to achieve this, then do that. But all the knowledge had to be encapsulated in the form of simple rules.

So what might you want to do with this? All sorts of things. Thousands of these systems were written, as I indicated before. But here's an example. I'm going to work out an example having to do with identification. And this example is patterned off of a classic program, strangely also written at Stanford, called MYCIN, which was developed to diagnose bacterial infections in the blood. So you come in, you've got some horrible disease, and the doctor gets curious about what antibiotic would be perfect for your disease, and he starts asking a lot of questions. I'm not going to deal with that, because that world has all kinds of unpronounceable terms like bacteroides and anaerobic and stuff like that. So instead we'll do something completely analogous: identifying animals in a little zoo, sort of a small-town type of zoo.

So I'm going to suggest that we write down on a piece of paper all the things we can observe about an animal, and then we'll try to figure out what the animal is. So, I don't know, what can we start with? Has hair. Then there are some characteristics of the following form. Has claws. Sharp teeth. And forward-pointing eyes. These are all characteristics of carnivores. We happen to have forward-pointing eyes too, but that's more because we used to swing around in the trees a lot and we needed the stereo vision, and we don't have the claws and the sharp teeth that go with it. But anyhow, those are typically characteristics of carnivores, as is eating meat. And this particular little animal we're looking at has also got spots, and it's very fast.

What is it? Well, everybody says it's a cheetah. Let's see how our program would figure that out. Well, the program might say, let's see: if it has hair, then rule one says that means it must be a mammal. We can imagine another rule that says if you have sharp claws, sharp teeth, and forward-pointing eyes, then you're a carnivore. And I'm using sort of hardware notation here. That's an and gate, right? So that means we have to have all of these characteristics before we will conclude that the animal is a carnivore. Now, this animal has also been observed to eat meat. So that means we've got extra evidence that the animal is carnivorous. And now, because the animal is a mammal and a carnivore, and has spots, and it's very fast, the animal is a cheetah. And I hope all of our African students agree that it must be a cheetah. It's a small zoo-- in a big zoo, who knows what it is? It's probably got some unpronounceable name; there are possibilities. But for our small zoo, that will do.

So we now have these written down in the form of and gates: several rules, R1, R2-- and there needs to be an and gate here-- that's R3, and an R4, all of which indicate that this animal is a cheetah.
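Here is a minimal sketch of those four rules run as a forward chainer, working from the given facts toward the conclusion off on the right. The Python representation and the forward_chain function are illustrative, not the code of MYCIN or any real system; the support table it keeps is the goal-tree trace that the question answering below relies on.

```python
# The four zoo rules as data, and a tiny forward chainer that keeps firing
# rules whose if-parts are all satisfied until nothing new can be concluded.
# The "support" table it leaves behind is the goal-tree trace.
RULES = [
    ("R1", ["has hair"], "mammal"),
    ("R2", ["has claws", "sharp teeth", "forward-pointing eyes"], "carnivore"),
    ("R3", ["mammal", "eats meat"], "carnivore"),
    ("R4", ["mammal", "carnivore", "has spots", "very fast"], "cheetah"),
]

def forward_chain(observed):
    facts = set(observed)
    support = {}                                  # conclusion -> (rule, antecedents)
    changed = True
    while changed:
        changed = False
        for name, antecedents, conclusion in RULES:
            if conclusion not in facts and all(a in facts for a in antecedents):
                facts.add(conclusion)
                support[conclusion] = (name, antecedents)
                changed = True
    return facts, support

observed = ["has hair", "has claws", "sharp teeth", "forward-pointing eyes",
            "eats meat", "has spots", "very fast"]
facts, support = forward_chain(observed)
print("cheetah" in facts)        # True
print(support["cheetah"])        # how did you decide? R4 and its four antecedents
print(support["carnivore"])      # why were you interested in claws? R2 fired
```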
PROFESSOR PATRICK WINSTON: So we've built ourselves a little rule-based expert system that can recognize exactly one animal, but you could imagine filling out this system with other rules so that you could recognize giraffes and penguins and all the other sorts of things you find in a small zoo.

So when you have a system like this that works as I've indicated, we're going to give it a special name: we're going to call it a forward-chaining rule-based-- because it uses rules-- expert system. And we're going to put "expert" in parentheses, because when these things were developed, for marketing reasons, they called them expert systems instead of novice systems. But are they really experts in a human sense? Not really, because they have these knee-jerk rules. They're not equipped with anything you might want to call common sense. They don't have an ability to deal with previous cases, like we do when we go to medical school. So they really ought to be called rule-based novice systems, because they reason like novices, on the basis of rules. But the tradition is to call them rule-based expert systems. And this one works forward from the facts we give it to the conclusion off on the right. That's why it's a forward-chaining system.

Can this system answer questions about its own behavior? [INAUDIBLE], what do you think?

SPEAKER 5: [INAUDIBLE].

PROFESSOR PATRICK WINSTON: Why?

SPEAKER 5: [INAUDIBLE].

PROFESSOR PATRICK WINSTON: Because it looks like a goal tree. Right. This is, in fact, building a goal tree, because each of these rules that requires several things to be true is creating an and node. And each of these situations where you have multiple reasons for believing that the thing is a carnivore is creating an or node. And we already know that you can answer questions about your own behavior if you leave behind a trace of a goal tree. So look at this. If I say to it, why were you interested in the animal's claws? Because I was trying to see if it was a carnivore. How did you know that the animal is a mammal? Because it has hair. Why did you think it was a cheetah? Because it's a mammal and a carnivore, it has spots, and it's very fast. So by working forward and backward in this goal tree, this too can answer questions about its own behavior.

So now you know how, going forward, you can write programs that answer questions about their behavior. Either you write the subroutines so that each one is wrapped around a goal, so you've got goal-centered programming, or you build a so-called expert system using rules, in which case it's easy to make it leave behind a trace of a goal tree, which makes it possible to answer questions about its own behavior, just as this [INAUDIBLE] program did.

But now, a little more vocabulary. I'm going to save time by erasing all of these things that I previously drew by way of connections. And I'm going to approach this zoo in a little different way. I'm not going to ask any questions about the animal. Instead, I'm going to say, mommy, is this thing I'm looking at a cheetah? And how would mommy go about figuring it out? In her head, she would say, well, I don't know. If it's going to be a cheetah, then it must be the case that it's a mammal, it must be the case that it's a carnivore, it must be the case that it has spots, and it must be the case that it's very fast. So, so far, what we've established is that if it's going to be a cheetah, it has to have the four characteristics that mommy finds behind this rule, R4.

So instead of working forward from facts, what I'm going to do is work backward from a hypothesis. So here the hypothesis is that this thing is a cheetah. How do I go about showing whether that's true or not? Well, I haven't done anything so far, because all I know is that it's a cheetah if all these things are true. But are they true? Well, to find out if it's a mammal, I can use rule one. And if I know, or can determine, that the animal has hair, then that part of it is taken care of. And I can similarly work my way back through carnivore. I say, well, it's a carnivore if it has claws, sharp teeth, and forward-pointing eyes. And inasmuch as the animal in question does, I'm through. I know it's a carnivore. I don't have to go on and show that it's a carnivore another way. So I never actually ask questions about whether it eats meat. Finally, the final two conditions are met by just an inspection of the animal. That's to say, it's in the database. I don't have to use any rules to determine that the animal has spots and is very fast.

So now I've got everything in place to say that it's a cheetah: because it's a carnivore, because it has claws, sharp teeth, and forward-pointing eyes, and all the rest of this stuff is similarly determined by going backwards, backwards from the hypothesis toward the facts, instead of from the facts forward to the conclusions. So if I build a system that works like that, I have a backward-chaining rule-based expert system.

But there's a very important characteristic of this system, in both backward and forward mode, and that is that this thing is a deduction system. That's because it's working with facts to produce new facts. When you have a deduction system, you can never take anything away. But these rule-based systems are also used in another mode, where it's possible to take something away. See, in fact world, in deduction world, you're talking about proving things. And once you prove something is true, it can't be false; if it is, you've got a contradiction in your system. But if you think of this as a programming language, if you think of using rules as a programming language, then you can think of arranging it so that these rules add or subtract from the database. Let me show you an example of a couple of systems.
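And here is the same toy rule base run the other way, a minimal sketch of backward chaining from the hypothesis toward the facts; the backward_chain function is an illustrative assumption, not MYCIN's implementation. Because the first rule that establishes carnivore is enough, the eats-meat question is never asked, just as on the board.

```python
# The same zoo rules run backward: start from the hypothesis "cheetah" and ask,
# for each antecedent, whether it is already in the database or can itself be
# established by some rule.
RULES = [
    ("R1", ["has hair"], "mammal"),
    ("R2", ["has claws", "sharp teeth", "forward-pointing eyes"], "carnivore"),
    ("R3", ["mammal", "eats meat"], "carnivore"),
    ("R4", ["mammal", "carnivore", "has spots", "very fast"], "cheetah"),
]

def backward_chain(hypothesis, facts, depth=0):
    pad = "  " * depth
    if hypothesis in facts:                       # already known: just look it up
        print(f"{pad}{hypothesis}: given")
        return True
    for name, antecedents, conclusion in RULES:   # try every rule that concludes it
        if conclusion == hypothesis:
            print(f"{pad}{hypothesis}: trying {name}")
            if all(backward_chain(a, facts, depth + 1) for a in antecedents):
                return True                       # one successful rule is enough
    return False

facts = {"has hair", "has claws", "sharp teeth", "forward-pointing eyes",
         "has spots", "very fast"}                # "eats meat" is never needed
print(backward_chain("cheetah", facts))           # True, and R3 is never tried
```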
PROFESSOR PATRICK WINSTON: First of all, since I've talked about the MYCIN system, let me show you an example of a MYCIN dialogue. That's a MYCIN dialogue. And you can see the appearance of words you have to go to medical school to learn. And here's a typical MYCIN rule, just like the rules for doing zoo analysis, only a more complicated domain.

But here's another example of a system that was written, not in the '80s, but just a couple of years ago by a student in the architecture department, a Ph.D. thesis. He was interested in the architecture of a Portuguese architect named Siza. And Siza's done a lot of mass housing stuff. And Siza has the idea that you ought to be able to design your own house. And so Jose Duarte, a Portuguese student, a Ph.D. student in architecture, wrote a rule-based system that was capable of designing Siza-like houses in response to the requirements and recommendations and desires of the people who are going to occupy the houses. So the most compelling part of this thing, of this exercise, was that Duarte took some of the designs of the program, mixed them up with some of the designs of Siza, and put them in front of Siza, and said, which ones did you do? And Siza couldn't tell. So somehow, the rule-based system that was built using this kind of technology was sufficient to confuse even the expert that they were patterned after.

But this program is a little complicated. It, too, has its own specialized lingo. So I'm not going to talk about it in detail, but rather talk instead about an analogous problem. And that is a problem that everyone has faced at one point or another, and that is the problem of putting groceries in a bag at a grocery store. It's the same thing, right? Instead of putting rooms in a house, you're putting groceries in a bag. And there must be some rules about how to do that. In fact, maybe some of you have been professional grocery store baggers? [INAUDIBLE] a grocery store professional bagger. You're a-- which one?

LISA: [INAUDIBLE]

PROFESSOR PATRICK WINSTON: Yeah, what is your name?

LISA: Lisa.

PROFESSOR PATRICK WINSTON: Lisa. OK, well we got two professional grocery store baggers. And I'm going to be now simulating a highly paid knowledge engineer desirous of building a program that knows how to bag groceries. So I'm going to visit your site, Market Basket, and I'm going to ask Lisa, now fearful of losing her job, if she would tell me about how she bags groceries. So could you suggest a rule?

LISA: Sure. Large items in the bottom.

PROFESSOR PATRICK WINSTON: Large items in the bottom. You see, that's why I'm a highly paid knowledge engineer, because I translate what she said into an if-then rule. So if large, then bottom. So now I--

SPEAKER 3: [LAUGHTER]

PROFESSOR PATRICK WINSTON: So how about you, [INAUDIBLE]? Have you got a suggestion? About how to bag groceries?

SPEAKER 6: The small things on top.

PROFESSOR PATRICK WINSTON: If small, then on top. Lisa, have you got anything else you could tell me?

LISA: Don't put too many heavy things in the same bag.

PROFESSOR PATRICK WINSTON: Don't put too many heavy things in the same bag. So if heavy greater than three, then new bag, or something like that. Is that all we're going to be able to-- does anybody else want to volunteer? [INAUDIBLE], have you bagged groceries in Turkey?

LISA: So they don't have grocery baggers, so we have to--

PROFESSOR PATRICK WINSTON: So everybody's a professional bagger in Turkey. Yeah. It's outsourced to the customers.

SPEAKER 7: So no squishies on the bottom. So if you have--

PROFESSOR PATRICK WINSTON: No squishies on the bottom.

SPEAKER 7: If you have tomatoes--

PROFESSOR PATRICK WINSTON: That's good. Tomatoes.

SPEAKER 7: You don't want them to get squished.

PROFESSOR PATRICK WINSTON: Now, there's a very different thing about squishies and tomatoes because tomato is specific, and squishy isn't.
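As a sketch of where this elicitation is heading, the rules gathered so far can be written straight into a tiny bagging program; the item attributes and the limit of three heavy items per bag are made-up illustrations, not actual store practice.

```python
# The elicited rules written down as a tiny bagging program that acts on a
# database of bags, rather than as pure deduction.
ITEMS = [
    {"name": "motor oil",     "large": True,  "heavy": True,  "squishy": False},
    {"name": "canned olives", "large": False, "heavy": True,  "squishy": False},
    {"name": "tomatoes",      "large": False, "heavy": False, "squishy": True},
    {"name": "potato chips",  "large": False, "heavy": False, "squishy": True},
]

def bag_groceries(items):
    bags = [[]]
    # "If large, then bottom": place large items first so they end up underneath.
    # "No squishies on the bottom": place squishy items last so they end up on top.
    ordering = sorted(items, key=lambda i: (not i["large"], i["squishy"]))
    for item in ordering:
        bag = bags[-1]
        # "If heavy greater than three, then new bag."
        if item["heavy"] and sum(1 for other in bag if other["heavy"]) >= 3:
            bag = []
            bags.append(bag)
        bag.append(item)
    return bags

for number, bag in enumerate(bag_groceries(ITEMS), start=1):
    print(f"bag {number}:", [item["name"] for item in bag])
```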
PROFESSOR PATRICK WINSTON: Now, one tendency of MIT students, of course, is that we all tend to generalize. I once knew a professor in the Sloan School who seemed real smart. And--

SPEAKER 3: [LAUGHTER]

PROFESSOR PATRICK WINSTON: Then I figured out what he did. If I were to say, I'm thinking about a red apple, he'd sit back and say, oh, I see you're contemplating colored fruit today. He was just taking it up one level of abstraction. Not a genius. He also was able to talk for an hour after he drew a triangle on the board. Amazing people. Anyhow, where were we? Oh, yes, bagging groceries.

So we're making some progress, but not as much as I would like. And in order to really make progress on tasks like this, you have to know about some principles of knowledge engineering. So principle number one, which I've listed over here as part of a gold star idea, is: deal with specific cases. While you're at the site, if all you do is talk to the experts like Lisa and [INAUDIBLE], all you're going to get is vague generalities, because they won't think of everything. So what you do is you say, well, let me watch you on the line. And then you'll see that they have to have some way of dealing with the milk. And then you'll see that they have to have some way of dealing with the potato chips. Nobody mentioned potato chips, except insofar as they might be squishy, and we don't have a definition for squishy. Nobody talked about the macaroni. And no one talked about the motor oil (this is a convenience store), and I don't want that in the same bag with the meat. And then no one talked about canned stuff. Here's a can of olives. So by looking at specific cases, you elicit from people knowledge they otherwise would not have thought to give you. That's knowledge engineering rule number one. And within a very few minutes, you'll have all three knowledge engineering rules and be prepared to be a highly paid knowledge engineer.

Let's call these heuristics. Heuristic number one: specific cases. Heuristic number two is: ask questions about things that appear to be the same but are actually handled differently. So there's some Birds Eye frozen peas, and here-- ugh-- some fresh cut sweet peas. And to me, a person who's never touched a grocery bag in my life-- maybe I'm from Mars-- I can't tell the difference. They're both peas. But I observe that the experts are handling these objects differently. So I say, why did you handle those peas differently from those peas? And what do they say? One's canned, and one's frozen. So what happens? Bingo, I've got some new words in my vocabulary. And those new vocabulary words are going to give me power over the domain, because I can now use those words in my rules. And I can write rules like: if frozen, then put them all together in a little plastic bag. Actually, that's too complicated, but that's what we end up doing, right? Why do we put them all together in a little plastic bag?

SPEAKER 8: [INAUDIBLE]

PROFESSOR PATRICK WINSTON: What's that?

SPEAKER 8: [INAUDIBLE]

PROFESSOR PATRICK WINSTON: Well, there are two explanations. There's the MIT explanation: we know that heat flow goes with the fourth power of the temperature and with the surface area and all that kind of stuff, so we want to get them all together in a ball, a sphere. The normal explanation is that they're going to melt anyway, so they might as well not get everything else wet. All right.

SPEAKER 3: [LAUGHTER]

PROFESSOR PATRICK WINSTON: So that's heuristic number two. And actually, there's heuristic number three, which I just want to relate to you for the first time, because I have been dealing with it a lot over this past summer. Heuristic number three is: you build a system, and you see when it cracks. And when it cracks is when you don't have one of the rules you need in order to get the program to execute as you want it to execute. So if I were to write a grocery store bagging program and have it bag some groceries, eventually it would either make a mistake or come to a grinding halt, and bingo, I know that there's a missing rule.

Isn't that what happens when you do a problem set and you hit an impasse? You're performing an experiment on yourself, and you're discovering that you don't have the whole program. In fact, I've listed this as a gold star idea having to do with engineering yourself, because all of these things that you can do for knowledge engineering are things you can do when you learn a new subject yourself. Essentially, you're making yourself into an expert system when you're learning circuit theory or electromagnetism or something of that sort. You're saying to yourself, well, let's look at some specific cases. What are the vocabulary items here that tell me why this problem is different from that problem? Oh, this is a cylinder instead of a sphere. Or you're working on a problem set, and you discover you can't work the problem, and you need to get another chunk of knowledge that makes it possible for you to do it. So these things, which you think of primarily as heuristics for doing knowledge engineering, are also mechanisms for making yourself smarter.

So that concludes what I want to talk with you about today. But the bottom line is that if you build a rule-based expert system, it can answer questions about its own behavior. If you build a program that's centered on goals, it can answer questions about its own behavior. If you build an integration program, it can answer questions about its own behavior.
745 00:41:56,120 --> 00:41:57,950 And if you want to build one of these systems, and you need 746 00:41:57,950 --> 00:42:00,760 to extract knowledge from an expert, you need to approach 747 00:42:00,760 --> 00:42:03,610 it with these kinds of heuristics because the expert 748 00:42:03,610 --> 00:42:06,320 won't think what to tell you unless you elicit that 749 00:42:06,320 --> 00:42:09,680 information by specific cases, by asking questions about 750 00:42:09,680 --> 00:42:12,760 differences, and by ultimately doing some experiments to see 751 00:42:12,760 --> 00:42:14,850 where your program is correct. 752 00:42:14,850 --> 00:42:19,960 So that really concludes what I had to say, except I want to 753 00:42:19,960 --> 00:42:25,080 ask the question, is this all we need to know about human 754 00:42:25,080 --> 00:42:26,010 intelligence? 755 00:42:26,010 --> 00:42:27,260 Can these things be-- 756 00:42:27,260 --> 00:42:30,500 are these things really smart? 757 00:42:30,500 --> 00:42:34,890 And the traditional answer is no, they're not really smart 758 00:42:34,890 --> 00:42:39,610 because their intelligence is this sort of thin veneer. 759 00:42:39,610 --> 00:42:44,090 And when you try to get underneath it, as written, 760 00:42:44,090 --> 00:42:46,160 they tend to crack. 761 00:42:46,160 --> 00:42:50,030 For example, we talk about a rule, we could talk about a 762 00:42:50,030 --> 00:42:53,290 rule that knows that you should put the potato chips on 763 00:42:53,290 --> 00:42:54,540 the top of the bag. 764 00:42:57,080 --> 00:43:00,375 But a program that knows that would have no idea why you 765 00:43:00,375 --> 00:43:03,660 would want to put the potato chips on top of the bag. 766 00:43:03,660 --> 00:43:05,610 They wouldn't know that if you put them on the bottom of the 767 00:43:05,610 --> 00:43:09,870 bag, they'll get crushed. 768 00:43:09,870 --> 00:43:13,310 And it wouldn't know that if they get crushed, the customer 769 00:43:13,310 --> 00:43:17,630 will get angry, because people don't like to eat crushed 770 00:43:17,630 --> 00:43:19,340 potato chips. 771 00:43:19,340 --> 00:43:21,280 So that's what I mean when I say the knowledge of these 772 00:43:21,280 --> 00:43:23,790 things tends to be a veneer. 773 00:43:23,790 --> 00:43:28,000 So the MYCIN program, during debugging, once prescribed a 774 00:43:28,000 --> 00:43:31,595 barrel of penicillin to be administered to a patient for 775 00:43:31,595 --> 00:43:34,340 its disease. 776 00:43:34,340 --> 00:43:38,510 They don't know, they don't have any common sense. 777 00:43:38,510 --> 00:43:43,620 So the question then becomes, well, I don't know. 778 00:43:43,620 --> 00:43:45,460 Does rule-based-- 779 00:43:45,460 --> 00:43:50,550 do rules have anything to do with common sense? 780 00:43:50,550 --> 00:43:53,340 And I'm becoming a little bit agnostic on that subject. 781 00:43:57,840 --> 00:44:00,440 Because there are certain indications, there are certain 782 00:44:00,440 --> 00:44:06,600 situations, in which rules could be said to play a role 783 00:44:06,600 --> 00:44:08,495 in our ordinary understanding of things. 784 00:44:11,250 --> 00:44:13,560 Would you like to see a demonstration? 785 00:44:13,560 --> 00:44:17,060 What I'm going to show you, when the clip speaks up-- 786 00:44:17,060 --> 00:44:21,640 well, before I make any promises, let me see if I'm 787 00:44:21,640 --> 00:44:22,910 actually connected to the web. 788 00:44:34,260 --> 00:44:36,770 MIT, good. 789 00:44:36,770 --> 00:44:38,820 MIT. 
790 00:44:38,820 --> 00:44:39,700 Guest. 791 00:44:39,700 --> 00:44:40,950 Yeah, that's me. 792 00:44:53,820 --> 00:44:55,070 Sounds good. 793 00:45:23,930 --> 00:45:26,650 OK, I just tested the system, and I've seen that it is actually 794 00:45:26,650 --> 00:45:28,330 connected to the web. 795 00:45:28,330 --> 00:45:32,230 And I'm going to adjust some system options here. 796 00:45:32,230 --> 00:45:35,120 I'll get rid of the text box. 797 00:45:35,120 --> 00:45:38,160 And we'll get rid of those and change the scale a little bit. 798 00:45:38,160 --> 00:45:40,780 What I'm going to do is I'm going to read a little 799 00:45:40,780 --> 00:45:44,050 synopsis of the Macbeth plot. 800 00:45:44,050 --> 00:45:45,190 You're MIT students. 801 00:45:45,190 --> 00:45:46,920 I'm sure you're all classically educated and very 802 00:45:46,920 --> 00:45:49,830 familiar with Shakespearean plots. 803 00:45:49,830 --> 00:45:52,980 So I'm going to read one. 804 00:45:52,980 --> 00:45:56,330 I'm going to read a version of a Macbeth plot. 805 00:45:56,330 --> 00:45:59,260 And it's going to go along like this. 806 00:45:59,260 --> 00:46:03,040 It's basically reading a rule base so far. 807 00:46:03,040 --> 00:46:07,910 And pretty soon, it's going to get beyond the rule base and 808 00:46:07,910 --> 00:46:09,435 start reading the Macbeth story. 809 00:46:12,320 --> 00:46:13,740 And there it is. 810 00:46:13,740 --> 00:46:14,720 It's read the Macbeth story. 811 00:46:14,720 --> 00:46:16,860 Let me show you what the Macbeth story looks like as 812 00:46:16,860 --> 00:46:18,660 it's actually retained by the system. 813 00:46:18,660 --> 00:46:19,320 That's it. 814 00:46:19,320 --> 00:46:20,570 Read that. 815 00:46:25,480 --> 00:46:27,310 OK, you ran out of time because the 816 00:46:27,310 --> 00:46:28,490 machine's already finished. 817 00:46:28,490 --> 00:46:30,830 It takes about five seconds to read this story. 818 00:46:30,830 --> 00:46:32,920 Now, as you look at this little synopsis of Macbeth, 819 00:46:32,920 --> 00:46:35,510 there are a couple things to note. 820 00:46:35,510 --> 00:46:39,090 For one thing, it says that Duncan is murdered. 821 00:46:39,090 --> 00:46:42,650 Duncan, I hope this doesn't bother you. 822 00:46:42,650 --> 00:46:44,530 Duncan is murdered by Macbeth. 823 00:46:44,530 --> 00:46:48,205 But at no time does it say that Duncan is dead. 824 00:46:48,205 --> 00:46:50,800 But you know Duncan's dead because he was murdered. 825 00:46:50,800 --> 00:46:53,168 If murdered, then dead. 826 00:46:53,168 --> 00:46:55,000 SPEAKER 3: [LAUGHTER]. 827 00:46:55,000 --> 00:46:56,640 PROFESSOR PATRICK WINSTON: So if you look a little further 828 00:46:56,640 --> 00:47:01,890 down, what you see is that Macduff kills Macbeth. 829 00:47:01,890 --> 00:47:04,260 Fourth line up from the bottom. 830 00:47:04,260 --> 00:47:08,410 Why did Macduff kill Macbeth? 831 00:47:08,410 --> 00:47:11,630 Doesn't say why in this story, but you have no trouble 832 00:47:11,630 --> 00:47:16,950 figuring out that it's because he got angry. 833 00:47:16,950 --> 00:47:19,970 And when you get angry, you don't necessarily kill 834 00:47:19,970 --> 00:47:21,300 somebody, but it's possible. 835 00:47:21,300 --> 00:47:24,190 SPEAKER 3: [LAUGHTER]. 836 00:47:24,190 --> 00:47:25,240 PROFESSOR PATRICK WINSTON: So now that you see what's in the 837 00:47:25,240 --> 00:47:28,550 story, let me take you back to this display. 838 00:47:28,550 --> 00:47:31,210 It's what we call an elaboration graph.
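The "if murdered, then dead" step can be sketched as a few lines of forward chaining, assuming a toy three-word fact format: "Duncan is dead" is never asserted, it only shows up because a rule fires. The encoding below is invented for illustration and is not the lecture system's actual representation.

```java
import java.util.*;

// A minimal forward chainer over Macbeth-style facts: rules of the form
// "if a fact matches 'X <trigger> Y', assert 'Y <consequence>'".
public class MacbethChainer {

    record Rule(String trigger, String consequence) {}

    public static void main(String[] args) {
        List<Rule> rules = List.of(
            new Rule("murders", "is dead"),
            new Rule("kills",   "is dead"));

        Set<String> facts = new LinkedHashSet<>(List.of(
            "Macbeth murders Duncan",
            "Macduff kills Macbeth"));

        boolean changed = true;
        while (changed) {                          // chain until quiescence
            changed = false;
            for (String fact : List.copyOf(facts)) {
                String[] words = fact.split(" ");
                if (words.length != 3) continue;   // only actor-verb-victim facts
                for (Rule rule : rules) {
                    if (words[1].equals(rule.trigger())) {
                        String deduced = words[2] + " " + rule.consequence();
                        if (facts.add(deduced)) changed = true;
                    }
                }
            }
        }
        facts.forEach(System.out::println);        // includes "Duncan is dead"
    }
}
```

The printed fact set ends with "Duncan is dead" and "Macbeth is dead", neither of which was ever stated, which is the whole point of the rule.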
839 00:47:31,210 --> 00:47:33,360 And when I blow it up, you can see that there are some familiar 840 00:47:33,360 --> 00:47:35,340 looking things in there. 841 00:47:35,340 --> 00:47:38,630 For example, up here in the left-hand corner, Macbeth 842 00:47:38,630 --> 00:47:41,700 murders Duncan, right over there. 843 00:47:41,700 --> 00:47:45,510 And over here, Macduff kills Macbeth. 844 00:47:45,510 --> 00:47:50,710 And if you look at what is a consequence of that, it looks 845 00:47:50,710 --> 00:47:53,010 like there must be a rule that says if you murder somebody, 846 00:47:53,010 --> 00:47:54,670 you harm them. 847 00:47:54,670 --> 00:47:58,540 And if you murder somebody, then they're dead. 848 00:47:58,540 --> 00:48:01,270 And one reason why you might kill somebody is because they 849 00:48:01,270 --> 00:48:02,500 angered you. 850 00:48:02,500 --> 00:48:05,880 And if you go the other way, one consequence of killing 851 00:48:05,880 --> 00:48:10,160 somebody is that you harm them, and that they die too. 852 00:48:10,160 --> 00:48:13,120 And if you harm somebody, they get angry, and their state 853 00:48:13,120 --> 00:48:14,370 goes negative. 854 00:48:19,050 --> 00:48:22,550 So that suggests that there are some things that we have 855 00:48:22,550 --> 00:48:26,145 on our hands that are very compiled and very, strangely 856 00:48:26,145 --> 00:48:30,970 enough, very rule-like in their character. 857 00:48:30,970 --> 00:48:35,730 Now, to close, I'm just going to read Hamlet. 858 00:48:35,730 --> 00:48:38,900 The Hamlet demonstration is much like the Macbeth one. 859 00:48:38,900 --> 00:48:42,270 In fact, Hamlet and Macbeth are very alike in their plot. 860 00:48:42,270 --> 00:48:44,530 But there's one thing that's well-illustrated by our 861 00:48:44,530 --> 00:48:47,940 particular capturing of Hamlet here. 862 00:48:47,940 --> 00:48:50,910 And that is that you'll note that the ratio of gray stuff 863 00:48:50,910 --> 00:48:54,060 to white stuff is considerable. 864 00:48:54,060 --> 00:48:59,810 The gray stuff is stuff that has been deduced by rules. 865 00:48:59,810 --> 00:49:02,140 And the reason there's so much gray stuff in this Hamlet 866 00:49:02,140 --> 00:49:05,380 story is because everybody's related to everybody else. 867 00:49:05,380 --> 00:49:08,920 So when you kill anybody, you irritate everybody else. 868 00:49:12,690 --> 00:49:13,750 So look at that. 869 00:49:13,750 --> 00:49:16,310 A few white things, those are the things that are explicit 870 00:49:16,310 --> 00:49:18,630 in the story, and lots of gray stuff. 871 00:49:18,630 --> 00:49:22,270 So what this is suggesting is that when we tell a story, 872 00:49:22,270 --> 00:49:25,440 it's mostly a matter of controlled hallucination. 873 00:49:25,440 --> 00:49:28,560 I know what rules are in your head, so I could take 874 00:49:28,560 --> 00:49:31,030 advantage of that in telling the story and not have to tell 875 00:49:31,030 --> 00:49:33,490 you anything I'm sure you're going to know. 876 00:49:33,490 --> 00:49:36,640 And so that's why, we've discovered, storytelling 877 00:49:36,640 --> 00:49:39,500 is largely a matter of just controlling how you're going 878 00:49:39,500 --> 00:49:41,010 along, a kind of controlled hallucination.
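A sketch of the gray-versus-white bookkeeping the elaboration graph display suggests, assuming a made-up story fragment: every fact is tagged as explicit (stated in the story) or deduced (added by a rule), and the deduced-to-explicit ratio falls out of a count. The story facts and the single rule here are invented stand-ins, not the system's Hamlet knowledge base.

```java
import java.util.*;

// Tag each fact as EXPLICIT (white) or DEDUCED (gray) and report the ratio,
// which is what the Hamlet display makes visible.
public class ElaborationTally {

    enum Source { EXPLICIT, DEDUCED }

    public static void main(String[] args) {
        Map<String, Source> facts = new LinkedHashMap<>();
        facts.put("Claudius kills King Hamlet", Source.EXPLICIT);
        facts.put("Hamlet kills Claudius",      Source.EXPLICIT);

        // One illustrative rule: killing someone harms them, leaves them dead,
        // and angers their relatives.
        for (String fact : List.copyOf(facts.keySet())) {
            String[] w = fact.split(" ", 3);           // actor, verb, victim
            if (w.length == 3 && w[1].equals("kills")) {
                facts.putIfAbsent(w[0] + " harms " + w[2],                 Source.DEDUCED);
                facts.putIfAbsent(w[2] + " is dead",                       Source.DEDUCED);
                facts.putIfAbsent("relatives of " + w[2] + " are angry",   Source.DEDUCED);
            }
        }

        long gray  = facts.values().stream().filter(s -> s == Source.DEDUCED).count();
        long white = facts.values().stream().filter(s -> s == Source.EXPLICIT).count();
        facts.forEach((f, s) -> System.out.println(s + "  " + f));
        System.out.printf("deduced-to-explicit ratio: %d to %d%n", gray, white);
    }
}
```

Even with two explicit facts and one rule, the deduced facts already outnumber the explicit ones, which is the "mostly gray" picture the display shows.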