The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation or view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

PATRICK WINSTON: Well, I suppose my first question has to do with some remarks that Tony made about Rod Brooks. I remember Rod Brooks' work for the one great idea he had, which was the idea of subsumption. And the idea of subsumption was to take the notion of procedural and data abstraction from ordinary programming and elevate it to a behavior level. And the reason for doing that was that if you weren't working so well at one level, you would appeal to another level to get you out of trouble. So that sounds like a powerful idea to me. And I'm just very interested in what the panelists construe to be the great principles of robotics that have emerged since then. Are there great principles that we can talk about in a classroom without just describing how a particular robot works?

STEFANIE TELLEX: So we were talking about this ourselves a little bit and sort of asking ourselves, what makes a systems paper? And what do you write down in one of those papers as the general principles that you extract from building a system? Because it seems like there are two kinds of papers in robotics. There's the systems paper, where you say, I built this thing and here's kind of how it works, and where it's hard, I think, to extract general principles. It's like, I built this and this and this, and this is what it did. Here's a video. But it does something amazing, so it's cool. And then there are the algorithm papers, which tend to get more citations, and, I don't know, they usually work because they have this chunk of knowledge -- subsumption architectures; RRT* is one of my favorite examples.
It's the kind of paper where there's an algorithm, and then there's math that shows how the algorithm works, and some results that they show. And there's this nugget that transfers from the author's brain to your brain. And I think it's hard to know what that nugget is when you've built a giant system.

One of the things that I've been thinking about, that might be what that looks like, is design patterns for robots. This is a concept from software engineering. It's at a higher level of abstraction than a library or something that you share; it's about the ways that you put software together. So if you're a hacker, you've probably heard of some of these patterns, like singleton and facade and strategy. They have these sort of evocative names from the book whose nickname is the Gang of Four book. And I think there's a set of design patterns for robotics that we are slowly discovering.

So when I was hanging out in Seth Teller and Nick Roy's group for my post-doc, there was one that I really got in my head, which was this idea of pub/sub -- publish and subscribe. You were talking about YARP and LCM and Russ. They all had this idea that you don't want to write a function to call to get the detection results. You just want your detector blasting out the results as fast as it can, all the time, and then you break that abstraction. And you get a lot of robustness in exchange for that. I think that's a design pattern for robots. I think there are probably about 30 more of them. And I bet Russ knows a lot of them. And Albert and Ed -- there are people who know them, maybe, but they're not written down. I think one thing I'd like to do is write some more of them down. Sorry I didn't.
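To make the pub/sub pattern Tellex describes concrete, here is a minimal sketch in Python. It is not code from any of the panelists' systems, and middleware like YARP or LCM has its own API; the bus, the topic name, and the detection message here are illustrative assumptions. The point is the shape of the pattern: the detector publishes its latest result continuously, and a consumer reads whatever is freshest rather than calling the detector as a function.

import threading, time, random

class LatestValueBus:
    """Toy publish/subscribe bus: keeps only the latest message per topic."""
    def __init__(self):
        self._latest = {}
        self._lock = threading.Lock()

    def publish(self, topic, msg):
        with self._lock:
            self._latest[topic] = msg

    def latest(self, topic):
        with self._lock:
            return self._latest.get(topic)

def detector(bus):
    # Publishes detections as fast as it can, whether or not anyone is listening.
    while True:
        detection = {"label": "mug", "confidence": random.random(), "stamp": time.time()}
        bus.publish("detections", detection)
        time.sleep(0.05)

def planner(bus):
    # Consumes whatever detection is freshest; tolerates a slow or crashed detector.
    for _ in range(10):
        msg = bus.latest("detections")
        if msg is not None and time.time() - msg["stamp"] < 0.5:
            print("planning around", msg["label"], round(msg["confidence"], 2))
        else:
            print("no fresh detection; falling back to a safe behavior")
        time.sleep(0.2)

if __name__ == "__main__":
    bus = LatestValueBus()
    threading.Thread(target=detector, args=(bus,), daemon=True).start()
    planner(bus)

The robustness Tellex mentions comes from the decoupling: if the detector is slow or dies, the planner still runs and simply falls back, instead of blocking inside a function call.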
PATRICK WINSTON: Russ, what do you think? Your talk seemed to focus on optimization as the answer.

RUSS TEDRAKE: It's a framework that I think you can cast a lot of the problems in and get clear results. I think, looking across the field, you can point to clear ideas that have worked very well. So I think for estimation, Bayes rule works really well. And Monte Carlo estimation has worked really well for planning in high-dimensional spaces. Somehow randomization was a magic bullet, where people started doing RRT-type things as well as trajectory-optimization-type things. I do think that the open source movement and the ability to write software in components and modules has been a huge, huge thing. And I do think that at the low-level control level, optimization-based controllers have been a magical thing. In any one of these sub-disciplines, you can point to a few real go-ahead ideas that have rocked our world, and everybody gets behind them.

You know, I think maybe the biggest one of all, actually, has been LIDAR. I think sensing has really come online and been so enabling in the last few years that if you look back at the last 15 or 20 years of robotics, the biggest step changes have been with the sensors -- sensors upped their frame rate and resolution, and gave depth. When LIDAR and the Kinect came out, those just changed everything.
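As a worked illustration of "Bayes rule works really well for estimation," here is a minimal recursive Bayes filter over a one-dimensional grid, the kind of update that underlies Monte Carlo localization and Kalman-style estimators. It is a generic textbook sketch, not code from Atlas or any panelist's system; the world map, motion noise, and sensor probabilities are made-up numbers for illustration.

# Minimal discrete Bayes filter: predict with a motion model, update with Bayes rule.
NUM_CELLS = 10

def normalize(belief):
    total = sum(belief)
    return [b / total for b in belief]

def predict(belief, move=1, p_correct=0.8):
    # Motion model: the intended move succeeds with p_correct, otherwise the robot stays put.
    new_belief = [0.0] * NUM_CELLS
    for i, b in enumerate(belief):
        new_belief[(i + move) % NUM_CELLS] += p_correct * b
        new_belief[i] += (1 - p_correct) * b
    return new_belief

def update(belief, measurement, world, p_hit=0.9, p_miss=0.2):
    # Bayes rule: posterior is proportional to the measurement likelihood times the prior.
    likelihood = [p_hit if world[i] == measurement else p_miss for i in range(NUM_CELLS)]
    return normalize([l * b for l, b in zip(likelihood, belief)])

if __name__ == "__main__":
    world = ["door", "wall", "wall", "door", "wall", "wall", "wall", "door", "wall", "wall"]
    belief = [1.0 / NUM_CELLS] * NUM_CELLS           # start fully uncertain
    for measurement in ["door", "wall", "wall", "door"]:
        belief = update(belief, measurement, world)  # incorporate the sensor reading
        belief = predict(belief)                     # then account for moving one cell
    print("most likely cell:", max(range(NUM_CELLS), key=lambda i: belief[i]))

A particle filter is the Monte Carlo version of the same two-step loop, with the belief represented by random samples instead of grid cells.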
PATRICK WINSTON: Before we leave the subject of Brooks and subsumption, Tony, you brought it up, and there was a little exchange between you and Russ about why it might be useful. I noted that Russ and John each talked about a major blunder that their machines made. Do you construe that any of Brooks's stuff might have been useful in avoiding those kinds of blunders?

TONY PRESCOTT: Potentially. But I think these are really challenging things that we're trying to do with these robots. So the biomimetic approach that I take is partly that I want to mine biology for insights about how to solve problems. And the problems that we're realizing are hard in robotics are the problems that are hard in biology as well. So I think we underestimated the problem of manipulating objects. But if you look in biology, what other species apart from us has really solved that problem to that level of dexterity? An octopus trunk, maybe -- sorry, an elephant trunk. So I think these challenges prove to be much more difficult than we might think. And then other things that intuitively seem hard are actually quite easy. So the path that we're taking in some of our biomimetic robots, like the MiRo robot toy, is to do stuff that people actually think looks hard, but that's relatively easy to do, and these are solvable problems. And then we can progress towards what are obviously the harder problems, where the brain has dedicated a lot of processing power. And I think for object manipulation, if you look in the brain, there's a massive representation for the hand. And there are all these extra motor systems in cortex that aren't there in non-mammals. And even in simpler mammals, they don't have motor cortex. We've developed all these extra motor cortical areas, direct corticospinal projections. All of this is dedicated to the manipulation problem.

So I think we're finding out what the hard problems are by trying to build the robots. And then we can maybe find out what the solutions are by looking at the biology. Because in that case, particularly, you have low-level systems that can do grasp, and that are there in reptiles. And then you have these additional systems in mammals, and particularly in primates, that can override the low-level systems to do dexterous control.

PATRICK WINSTON: John, you look like you're eager to say something.

JOHN LEONARD: I just want to talk about subsumption, because it had such a huge effect on my life as a grad student. It was sort of the big thing in 1990 when I was finishing up. But I really developed a strong aversion to it, and so I tried to argue with Rod back then, but not very successfully.
I would say, when will you build a robot that knows its position? And he would say, I don't know my position. But I think some of the biological evidence, like the grid cells and things, suggests that maybe at an autonomic, subconscious level there is sort of position information in the brain. But subsumption, I think, as a way of trying to strive for robustness, where you build on layers, that's a great inspiration. But I feel that the intelligence-without-representation sort of work that Rod did, I just don't buy it. I think we need representation.

PATRICK WINSTON: I guess there are two separable ideas.

JOHN LEONARD: Yes.

PATRICK WINSTON: The no-representation idea and the layering idea.

JOHN LEONARD: Yeah. I like the layering, and I'm less keen on the no representation.

RUSS TEDRAKE: I'm curious if Giorgio-- I mean, you showed complicated system diagrams. And you're obviously doing very complicated tasks with a very complicated robot. Do you see subsumption when you look at those--

GIORGIO METTA: Well, it's actually there. I don't have time to enter into the details. But the way it's implemented, er, allows you to do subsumption or other things. There's a way to physically take the modules and, without modifying the modules, connect them through scripts that can insert logic on each module to sort of preprocess the messages and decide whether to subsume one of the modules or not, or to activate the various behaviors. So in practice, it can be a subsumption architecture, a pure one, or whatever other combination you have in mind for the particular task.
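A minimal sketch of the kind of arrangement Winston and Metta are describing: behavior modules produce commands independently, and a thin arbiter, standing in for Metta's connection scripts, lets a higher-priority behavior subsume a lower one without either module knowing about the other. The behavior names, priorities, and command format here are illustrative assumptions, not the iCub's actual modules or YARP's API.

# Toy subsumption-style arbitration: higher layers override lower ones when they
# have something to say; otherwise control falls through to the layer below.

def wander(sensors):
    # Lowest layer: always has an opinion.
    return {"forward": 0.3, "turn": 0.0}

def avoid_obstacles(sensors):
    # Middle layer: only speaks up when an obstacle is close.
    if sensors["obstacle_distance"] < 0.5:
        return {"forward": 0.0, "turn": 0.8}
    return None

def go_to_goal(sensors):
    # Highest layer: only speaks up when the goal is visible.
    if sensors["goal_visible"]:
        return {"forward": 0.5, "turn": sensors["goal_bearing"] * 0.5}
    return None

# Layers listed from highest priority to lowest; none of them know about the others.
LAYERS = [go_to_goal, avoid_obstacles, wander]

def arbitrate(sensors):
    for layer in LAYERS:
        command = layer(sensors)
        if command is not None:      # this layer subsumes everything below it
            return command
    return {"forward": 0.0, "turn": 0.0}

if __name__ == "__main__":
    print(arbitrate({"obstacle_distance": 0.3, "goal_visible": False, "goal_bearing": 0.0}))
    print(arbitrate({"obstacle_distance": 2.0, "goal_visible": True, "goal_bearing": -0.4}))

The arbitration logic lives entirely outside the behaviors, which is the property Metta points to: the same modules can be wired into a pure subsumption scheme or some other combination for a particular task.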
RUSS TEDRAKE: Maybe a slightly different question is, did the subsumption view of the world shape the way you designed your system?

GIORGIO METTA: It may have happened without knowing, because that piece of software started at MIT while I was working with Rod. But we didn't try to build a subsumption architecture in that specific case. But the style, maybe, of the publish-subscribe that we ended up doing was derived from subsumption in a sense. It was in spirit, although not in the software itself.

But going back to whether there's a clear message that we can take, certainly I will subscribe to Stefanie's message about recyclable software, the fact that we can now build these modules and build large architectures. This allows doing experiments that we never dreamed of until a few years ago. So we can connect many powerful computers and run vision and control optimization very efficiently, especially if we're recycling the software. So you don't have to implement inverse kinematics every day.

PATRICK WINSTON: Well, I think the casual observer this afternoon, one good example being me, would get the sense that, with the exception of Tony, the other four of you are interested in the behavior, but not necessarily interested in understanding the biology. Is that a misimpression or is that correct? And when you mention LIDAR, for example, that's something I don't think I use. It's enabling a robot with a mechanism that is not biological. So to what degree are any of you interested in understanding the nature of biological computation?

JOHN LEONARD: I care deeply about it. I just don't feel I have the tools or the bandwidth to really dive into it. But every time I talk to Matt Wilson I leave feeling in awe, like I wish I could clone myself and hang out across the street.

GIORGIO METTA: Well-- yeah. Go ahead.

STEFANIE TELLEX: I get it. I kind of feel the same. I mean, I'm an engineer. I'm a hacker. I build things. And I think the way that I can make the most progress towards understanding intelligence is by trying to build things. But every time I talk to Josh, you know, I learn something new, and Noah and Vikash.

PATRICK WINSTON: Say that again?
STEFANIE TELLEX: Every time I talk to Josh and Noah Goodman and Bertram Malle, people from the psychology and cognitive science [INAUDIBLE], I learn something and I take things away. But I don't get excited about trying to build a faithful model that incorporates everything that we know about the brain, because I just can't put all that in my brain. And I feel that I'm better guided by my engineering intuition and the things that I've learned by trying to build systems and seeing how that plays out.

PATRICK WINSTON: On the other hand, Tony, you are interested in biology and you do build stuff.

TONY PRESCOTT: Yeah. I'm interested in it because I trained originally as a psychologist. So I came into robotics in order to build physical models.

PATRICK WINSTON: So why do you build stuff?

TONY PRESCOTT: Because I think the theory is the machine. Our theories in psychology and neuroscience are never going to be like theories in physics. We're not going to be able to express them concisely and convince people of them in a short paper. We are going to be able, though, to build them into machines like robots and show people that they do behavior, and hopefully convince people that way that we have a complete theory.

PATRICK WINSTON: So is it a demonstration purpose? Or a convincing purpose?

TONY PRESCOTT: It's partly to demonstrate the sufficiency of the theory. I think that's the big reason. But another motivation that has grown more important for me is to be able to ask questions of biologists that wouldn't occur to them. Because I think the engineering approach -- you're actually building something -- raises a lot of different questions. And those are then interesting questions to pursue in biological studies, and questions that might not occur to you otherwise.
So I go back to Braitenberg's comment that when you try to understand a system, there's a tendency to overestimate its complexity, and that doing synthesis is a whole lot different from analysis. And actually, with the brain, we tend to either underestimate or overestimate its complexity, and we rarely get it right. So the things that we think are complex sometimes turn out to be easy. An example would be in our whisker system: it's really quite easy to measure texture with a whisker, and there are lots of different ways of doing that that work. But intuitively you might not have thought that. Getting shape out of whiskers is harder, because it's an integration problem across time, and you have to track position and all these other things. So these things turn out to be hard. So I think trying to build synthetic systems helps us understand what are the real challenges the brain has to solve, and that's interesting for me.
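To illustrate why texture is the "easy" case Prescott mentions, here is one hedged sketch of a texture cue from a single whisker: high-frequency vibration of the deflection signal, summarized as the RMS of its frame-to-frame differences, tends to separate rough from smooth surfaces. This is just one possible instance of the "lots of different ways" he alludes to, with made-up signals and parameters; shape, by contrast, would require integrating contact points over time along with the whisker's tracked position, which this snippet does not attempt.

import math

def texture_roughness(deflection, dt=0.001):
    # Crude texture cue: RMS of the whisker deflection's first difference.
    # Rough surfaces make the whisker tip stick and slip, so the signal is "buzzier".
    diffs = [(b - a) / dt for a, b in zip(deflection, deflection[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

if __name__ == "__main__":
    # Synthetic deflection traces (radians): a smooth sweep vs. the same sweep plus
    # high-frequency stick-slip ripple, standing in for smooth and rough surfaces.
    t = [i * 0.001 for i in range(1000)]
    smooth = [0.05 * math.sin(2 * math.pi * 1.0 * x) for x in t]
    rough = [s + 0.005 * math.sin(2 * math.pi * 150 * x) for s, x in zip(smooth, t)]
    print("smooth cue:", round(texture_roughness(smooth), 2))
    print("rough cue:", round(texture_roughness(rough), 2))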
PATRICK WINSTON: Is there an example of a question that you didn't know was there when you started?

TONY PRESCOTT: Yeah.

PATRICK WINSTON: And you wouldn't have found if you hadn't attempted to build?

TONY PRESCOTT: So when we started trying to build artificial whiskers, the engineers that were building the robot said, well, how much power does the motor that drives the whisker have to have? I mean, what happens when the whisker touches something? Does it continue to move and bend against the surface? Or does it stop? And well, we said, we'll look that up in the literature. And of course, there wasn't an experiment that answered that question. So at that point we said, OK, we'll get a high-speed camera and we'll start watching rats. And we found that when the whiskers touch, they stop moving very quickly. So they make a light touch. And intuitively, yeah, maybe -- because we make a light touch. Obviously, we don't bash our hands against surfaces. But it's not obvious, necessarily, when you have a flexible sensor that that's what you would do. And in some circumstances, the rats allow their whiskers to bend against objects. So understanding when you make a light touch and when you bend was really a question that became important to us after we'd started thinking about how to engineer the system. How powerful do the motors need to be?

PATRICK WINSTON: I'm very sympathetic to that view, being an engineer myself. I always say if you can't build it, you don't really understand it.

TONY PRESCOTT: Yeah.

PATRICK WINSTON: So many of you -- all of you have talked about impressive systems today. And I wonder if any of you would like to comment on some problem you didn't know was there and wouldn't have discovered if you hadn't been building the kinds of stuff that you have built.

RUSS TEDRAKE: It's a long list. I mean, I think we learn a lot every day. Let me be specific. So with Atlas, we took a robot to a level of maturity that I've never taken before. I see videos from companies like Boston Dynamics that are extremely impressive. I think one of the things that separates a company like that from the results you get in a research lab is incredible amounts of hours -- sort of a religion of data logging and analysis, and finding corner cases, logging them, addressing them, incremental improvement. And researchers don't often do that. And actually, I think a theme in at least a couple of the talks was that maybe this is actually a central requirement. And in some sense, our autonomy really should be well suited to doing that, to maybe automatically finding corner cases and proving robustness and all these things.

But the places that broke our theory were weird. I mean, the stiction in the joints of Atlas is just dominant. So we do torque control, but we have to send a feedforward velocity signal to get over friction.
If we start at zero and we have to start moving, and we don't send a feedforward velocity signal, our model is just completely wrong. When you're walking on cinder blocks and you go near the ankle limit in pitch, there's a strange coupling in the mechanism that causes the ankle to roll. And it'll kick your robot over just like that, if you don't watch for it. And we thought about putting that into our model, addressing it with sophisticated things. It's hard and ugly and gross. And it's possible, but we do things to just, you know, Band-Aid solution that. And I think there's all this stuff, all these details that come in. I think the theory should address it all. I think we did pretty well. I'd say we got 70% or 80% of the way there with our theory this time. And then we just decided that there was a deadline and we had to cover some stuff up with Band-Aids. But that's the good stuff. That's the stuff we should be focused on. That's the stuff we should be devoting our research efforts to.
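A rough sketch of the kind of fix Tedrake is describing, under stated assumptions: a torque-controlled joint whose low-level loop also accepts a feedforward velocity, used here to break static friction whenever the desired motion starts from rest. The gains, the breakaway velocity, and the command structure are illustrative; the actual Atlas control interface and values are not given in the discussion.

def joint_command(q, qd, q_des, qd_des, kp=300.0, kd=20.0, breakaway_vel=0.05):
    """Toy torque-plus-feedforward-velocity command for one stiction-dominated joint.

    The PD term is the 'principled' part of the controller; the feedforward
    velocity is the Band-Aid that keeps unmodeled stiction from pinning the
    joint when it has to start moving from rest.
    """
    tau = kp * (q_des - q) + kd * (qd_des - qd)   # ordinary PD torque

    qd_ff = qd_des
    if abs(qd) < 1e-3 and abs(qd_des) > 0.0:
        # Starting from rest: raise the feedforward velocity to a minimum
        # magnitude, in the direction of desired motion, to get over friction.
        direction = 1.0 if qd_des > 0 else -1.0
        qd_ff = direction * max(abs(qd_des), breakaway_vel)

    return {"torque": tau, "velocity_ff": qd_ff}

if __name__ == "__main__":
    # Joint at rest, commanded to move slowly: the feedforward velocity gets bumped up.
    print(joint_command(q=0.00, qd=0.0, q_des=0.10, qd_des=0.01))
    # Joint already moving: the feedforward velocity just tracks the desired velocity.
    print(joint_command(q=0.05, qd=0.20, q_des=0.10, qd_des=0.20))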
PATRICK WINSTON: If you were to go into a log cabin today and write a book on Atlas and your work on it, what fraction of that book would be about corner cases and what fraction would be about principles?

RUSS TEDRAKE: We really stuck to principles until last November. That was our threshold. In November, we had to send the robot back for upgrades. We said, until then, we're going to do research. The code base is going to be clean. And then when we got the robot back in January, we did everything we needed to make the robot compete in the challenge. So I think we got 70% or 80% of the way there. And then it was just putting hours on the robot, finding those screw cases. And then, if I were to write a book in five years, I hope it would be--

PATRICK WINSTON: Is there a principle on that ankle roll? Or was that--

RUSS TEDRAKE: Oh, absolutely. We could have thrown that into the model. It just would have increased the dimensionality. It would have been a non-linear term. We could have done it, we just didn't have time to do it at the time. And it was going to be one of many things we would have had to do if we had taken the principled approach throughout. There are other things that we couldn't have put nicely into the model, that we would have needed to address. And that should be our agenda.

TONY PRESCOTT: If that was your own robot, would you have just re-engineered the ankle to make that problem less of an issue?

RUSS TEDRAKE: It wasn't about the ankle. It was about the fact that there's always going to be something unmodeled that's going to come up and get you. And with robots, I think we're data starved. We don't have the big data problem in robotics yet. And I think you're limited by the hours you can put on your robot. We need to think about how you aggressively search for the cases that are going to get you. How do you prove robustness to unmodeled things? I think this is fundamental. It's not a theme I would have prioritized if I hadn't gotten this far with a robot.

PATRICK WINSTON: But Giorgio, what about building iCub? Were there big problems that emerged in the course of building it that you hadn't anticipated?

GIORGIO METTA: Well, first of all, there's a problem of power. I guess for Atlas it's very different, but for electric motors you're always short of space -- short of space to put the actuators in. If you have a target size, you start filling the available space very quickly, and there's no room for anything else. And then you start sticking the electronics wherever you can, and cables and everything. Certainly the difference in design between biological actuators and artificial actuators makes life very difficult, especially when you have to go through something like a hand, where you'd like to have a lot of degrees of freedom.
But there's no way you could actually build it, so you have to take shortcuts here and there. And I guess the same is true, then, for computation. You resort to putting as many micro-controllers as you can inside the robot, because you want to have efficient control loops. And then you say, well, maybe have a cable for the proper image processing, because there's no way you can squeeze that into the robot itself. It's not surprising. It's just a matter of, when you're doing the design, you soon discover that there are limitations you have to take into account. I don't know whether it is surprising. I mean, I guess we learned the lesson across many years of design. We designed other robots before the iCub. I thought we knew where the limits were with the current technology.

PATRICK WINSTON: You say you learned a lot building iCub. I wonder if this knowledge is accessible. It's knowledge that you discussed in meetings and seminars, and thought about at night and fixed the next day. Is any of it-- if I wanted to build iCub and couldn't talk to you, would I have to start from scratch? I know you've got the stuff on the web and whatnot, but--

GIORGIO METTA: Yeah. That's probably enough for--

PATRICK WINSTON: --but the reasons aren't in them.

GIORGIO METTA: Sorry?

PATRICK WINSTON: Your web material has designs, but it doesn't have reasons.

GIORGIO METTA: Yeah. Yeah, that's right. No, the other thing is we never documented the process itself. So that information, I don't know, resides in the people that actually made the choices when we were doing the design.

There's one other thing that maybe is important, which is that the iCub is about 5,000 parts. And that's not good, because there are 5,000 parts that can break. And that may be something interesting for the design of materials for robots, or new ways of building robots.
And at the moment, basically everything that could potentially break has, at some point, failed on the iCub over the years. Even parts that theoretically we didn't think could break -- well, they could. We estimated maximum torques, whatever, and then it happened. Somebody did something silly and we broke a shoulder. It's a steel part that we never thought could actually break. And it completely failed. I mean, those types of things are maybe interesting for future designs, for either simplifying the number of parts, or figuring out ways of using fewer parts, or, let's say, different ways of actually building the mechanics of the robot.

PATRICK WINSTON: I suppose I bring it up because some of us in CSAIL are addressing-- not me, but some people in CSAIL are interested in how you capture design rationale, how you capture those conversations, those whiteboard sketches and all of that sort of thing, so that the next generation can learn by some mechanism other than apprenticeship.

But let's see. Where to go from here? iCub is obviously a major undertaking, and Russ has been working like a slave for three years on the Atlas robot. I don't know quite how to phrase this without being too trumpish, but the ratio of soldering time to thinking time must be very high on projects like this. Is that your sense? Or do you think that the building of these things is actually essential to working out the ideas? Maybe that's not quite the question I'm going to ask. Maybe a sharper question is, given the high ratio of soldering time to thinking time, is it something that a student should do?

RUSS TEDRAKE: I'm lucky that someone else built the robot for us. Giorgio has done much more than we have in this regard.

PATRICK WINSTON: Well, by soldering time, you know--

RUSS TEDRAKE: I know. Yeah, yeah, sure. We--

PATRICK WINSTON: It's a metaphor.

RUSS TEDRAKE: Yeah.
But we got pretty far into it with the good graces of DARPA and Google slash Boston Dynamics. The software is where we've invested our solder time -- a huge amount of software engineering effort. I spent countless hours on setting up build servers and stuff. Am I stronger, you know, am I better for it? I think, having invested, we can do research very fast now. So I'm in a new position to be able to try really complicated ideas very quickly because of that investment. I knew going in what I was going to be doing. I saw what John and other people got out of being in the Urban Challenge, especially the tools, like the LCM we've been talking about today. And I wanted that for my group. So it was a very conscious decision. I'm at a place now where we can do fantastic research. Every one of the students involved did great research work on the project. We hired a few staff programmers to help with some of the non-research stuff. And I think the hardware is important. It's hard to balance, but I do think it's important.

PATRICK WINSTON: Just one short follow-up question there. Earlier you said that some of your students didn't want to work on it. Why was that? Was that a principled reason?

RUSS TEDRAKE: People knew how much soldering time there was going to be, right? And the people whose research agenda was more theoretical didn't want that soldering time. Other people said, I'm still looking for ideas; this is going to motivate me for my future work. They jumped right in. And super strong students made different decisions on that.

PATRICK WINSTON: And they both made the right decisions.

RUSS TEDRAKE: I think so.

PATRICK WINSTON: Yeah.

RUSS TEDRAKE: Yeah.

PATRICK WINSTON: But John, you've also been involved in-- well, the self-driving car thing was a major DARPA grand challenge.
Some people have been critical of these grand challenges because they say that, well, they drive the technology up the closest hill, but they don't get you onto a different hill. Do you have any feelings about these things, whether they're a good idea in retrospect, having participated in them?

JOHN LEONARD: Let's see. I'm really torn on that one, because I see the short-term benefits for the community -- and you can point to things like the Google car -- there's a clear impact. But DARPA does have a mindset that once they've done something, they declare victory and move on. So now if you work, say, on legged locomotion, which one of my junior colleagues does, DARPA won't answer his emails. It's like, OK, we did legged locomotion. And so I think the challenge is to be mindful of where we are in terms of the real long-term progress. And it's not an easy conversation to have with the funding agencies, but--

PATRICK WINSTON: But what about a brand new way of doing something that is not going to be competitive in terms of demonstration for a while? Is that a problem that's amplified by these DARPA grand challenges? I mean, take chess, for example. If you had a great idea about how humans play chess, you would never be competitive with Deep Blue, or not for a long time. So you wouldn't be in a DARPA program that was aimed at doing chess. So do you see that as a problem?

RUSS TEDRAKE: I think it's a huge problem. But I still see a role for these kinds of competitions as benchmarks. And I wouldn't do another one today. I mean, for me it was the right time to sort of see how far our theory had gotten, try it on a much more complicated robot, benchmark where we are, get some new ideas going forward. It was perfect for me. But you can't let it set a research agenda.

JOHN LEONARD: And they're dangerous for students.
So one of our strongest students never got his PhD, because his wife was in a PhD program in biology and he did the DARPA challenge. She finished her thesis, and he said, I don't want to live alone on the east coast while she starts her faculty position in California, so I'm out of here. And that's the sort of thing.

STEFANIE TELLEX: I've kind of made different decisions about that over my career. So when I was a post-doc at MIT, I really, really, really worked to avoid soldering time. I was fortunate. I kind of walked around -- there were all these great robotic systems -- and I would bolt language on and get one paper, and bolt language on another way and get another paper. And when you're looking to get a faculty position, you have to have this focused research agenda. And so I was focused on that. And it worked; I think it was a very productive time for me. But I've really valued the past two years at Brown, where there aren't as many other roboticists around. So I've really been forced to broaden myself as a roboticist and spend a lot more time soldering, making this system for pick and place on Baxter. The first year I didn't hack at all, and the second year I started hacking on that system, the one that was doing the grasping with my student. It was the best decision I ever made. I learned so much about the abstractions. Because the problems that we needed to solve at the beginning, before I started hacking, I just didn't understand. The problems I thought we needed to solve were not the problems that we actually needed to solve to make the robot do something useful. And I don't think there's any way we could have gotten to that knowledge without hacking and trying to build it.

GIORGIO METTA: In our case, we've been lucky, in a sense, that we had resources in terms of engineers that could do the soldering. So at the moment we still have about 25 people that are just doing the soldering.
So it's just a large number. Yeah.

PATRICK WINSTON: That would look like a battalion, something like that.

JOHN LEONARD: Can I say something more generally? There are a lot of claims in the media and sort of hyped fears about robots that take over the world, or sort of very strong AI. And partly, sometimes they point to Moore's law as this evidence of great progress. But I would say that in robotics we're lacking the high-performance commodity robot hardware that would let us make tremendous progress. And so things like Baxter are great, because they're cheap and they're safe, and they're a step in that direction. But I think we're going to look back 20 years from now and say, how did we make any progress with the robots we had at the time? We really need better robots that just get out there massively in the labs.

RUSS TEDRAKE: But--

TONY PRESCOTT: I was going to echo that, because I think robotics is massively interdisciplinary. And you've maybe got people leaning slightly more towards control here. What we're trying to do in Sheffield Robotics is actually bring in more of the other disciplines in engineering, but also science and social science. Everybody has a potential contribution to make -- certainly electronic engineering and mechanical engineering. Soft robotics, I think, depends very much on new materials, on materials science. And then these things have different control challenges. But sometimes the control problem is really simplified if you have the right material substrates. So if you can solve Giorgio's problem of having a powerful actuator, then his problem of building iCub is much simplified. So I think we have to think of robotics as this large, multi-disciplinary enterprise. And if we're going to build robots that are useful, you have to pull in all this expertise. And we're interested in pulling the expertise in from social science as well.
791 00:34:11,760 --> 00:34:13,775 Because I think one of the major problems 792 00:34:13,775 --> 00:34:16,710 that we will face in AI and in robotics 793 00:34:16,710 --> 00:34:20,350 is a kind of backlash, which is already happening. 794 00:34:20,350 --> 00:34:22,070 Do we really want these machines? 795 00:34:22,070 --> 00:34:24,300 And how are they going to change the world? 796 00:34:24,300 --> 00:34:26,425 Understanding what the impacts will be 797 00:34:26,425 --> 00:34:29,520 and trying to build in safeguards 798 00:34:29,520 --> 00:34:32,471 against the negative impact is something we should work on. 799 00:34:32,471 --> 00:34:34,679 PATRICK WINSTON: But Giorgio had on one of the slides 800 00:34:34,679 --> 00:34:39,630 that one of the reasons for doing all this was fun. 801 00:34:39,630 --> 00:34:42,000 And I wonder to what degree that is the motivation? 802 00:34:42,000 --> 00:34:45,580 Because all of you talked about how difficult the problems are, 803 00:34:45,580 --> 00:34:48,420 and some of them are like the ones you talked about, 804 00:34:48,420 --> 00:34:50,579 John, watching that policeman say, 805 00:34:50,579 --> 00:34:51,870 go ahead through the red light. 806 00:34:51,870 --> 00:34:56,880 Those seem not insurmountable, but very tough and sound 807 00:34:56,880 --> 00:34:59,350 like they would take five decades. 808 00:34:59,350 --> 00:35:02,917 So is the motivation largely that it's fun? 809 00:35:02,917 --> 00:35:04,500 RUSS TEDRAKE: That's a big part of it. 810 00:35:04,500 --> 00:35:07,920 I mean, so we've done some work on unsteady aerodynamics 811 00:35:07,920 --> 00:35:12,300 and the like, too, and I like-- so we made robotic birds. 812 00:35:12,300 --> 00:35:15,150 And I tried to make robotic birds land on a perch. 813 00:35:15,150 --> 00:35:18,180 And then we had a small side project 814 00:35:18,180 --> 00:35:21,450 where we tried to show that the exact same technology could 815 00:35:21,450 --> 00:35:23,126 help a wind turbine be more efficient. 816 00:35:23,126 --> 00:35:24,000 PATRICK WINSTON: Yeah. 817 00:35:24,000 --> 00:35:26,860 RUSS TEDRAKE: And that's the important problem. 818 00:35:26,860 --> 00:35:30,612 I could have easily started off and done some of the same work 819 00:35:30,612 --> 00:35:33,070 by saying I was going to make wind turbines more efficient. 820 00:35:33,070 --> 00:35:34,528 I was going to study pitch control. 821 00:35:34,528 --> 00:35:37,420 I'd be very serious about that. 822 00:35:37,420 --> 00:35:38,920 But I did it the other way around. 823 00:35:38,920 --> 00:35:40,806 I wanted to try to make a robot bird. 824 00:35:40,806 --> 00:35:42,180 And I think the win-- not only do 825 00:35:42,180 --> 00:35:45,989 I get excited going in, trying to make a bird 826 00:35:45,989 --> 00:35:48,030 fly for the first time instead of getting 2% more 827 00:35:48,030 --> 00:35:49,590 efficient on a wind turbine. 828 00:35:49,590 --> 00:35:52,286 I mean, I go in more excited, but I also-- 829 00:35:52,286 --> 00:35:53,910 I get to recruit the very best students 830 00:35:53,910 --> 00:35:56,120 in the world because of it. 831 00:35:56,120 --> 00:35:59,780 There's just so many good reasons to do that. 832 00:35:59,780 --> 00:36:02,280 Sometimes it makes me feel a little shallow because the wind 833 00:36:02,280 --> 00:36:05,670 turbine's way more important than a robotic bird. 834 00:36:05,670 --> 00:36:09,274 But that is-- the fun is the choice. 835 00:36:09,274 --> 00:36:10,940 PATRICK WINSTON: What about it, Giorgio?
836 00:36:10,940 --> 00:36:11,670 Do you do it-- 837 00:36:11,670 --> 00:36:14,340 or you have a huge group there. 838 00:36:14,340 --> 00:36:17,790 Somebody must be paid for all those people. 839 00:36:17,790 --> 00:36:23,005 Are they in expectation of applications in the near term? 840 00:36:23,005 --> 00:36:23,880 GIORGIO METTA: Sorry. 841 00:36:23,880 --> 00:36:25,921 PATRICK WINSTON: You have a huge group of people. 842 00:36:25,921 --> 00:36:27,030 GIORGIO METTA: Yeah. 843 00:36:27,030 --> 00:36:31,050 I mean, the group is mainly funded internally 844 00:36:31,050 --> 00:36:39,440 by IIT, which is public funding for large groups, basically. 845 00:36:39,440 --> 00:36:45,240 And actually, the robotics program at IIT is even larger-- 846 00:36:45,240 --> 00:36:47,790 the group on the iCub is actually-- 847 00:36:47,790 --> 00:36:51,200 there are four PIs working on it, 848 00:36:51,200 --> 00:36:53,970 and collaborations with other people 849 00:36:53,970 --> 00:36:58,370 like with the IIT-MIT group also. 850 00:36:58,370 --> 00:37:03,710 But the overall robotics program at IIT's about 250 people, 851 00:37:03,710 --> 00:37:04,710 I would say. 852 00:37:04,710 --> 00:37:09,120 So they're certainly also part of the reason 853 00:37:09,120 --> 00:37:13,785 why we've been able to go for a complicated platform. 854 00:37:13,785 --> 00:37:17,330 There was one that actually participated in the DARPA 855 00:37:17,330 --> 00:37:18,950 Robotics Challenge. 856 00:37:18,950 --> 00:37:24,000 There's people doing quadrupeds and people doing robotics 857 00:37:24,000 --> 00:37:25,257 for rehabilitation. 858 00:37:25,257 --> 00:37:26,340 So there's various things. 859 00:37:26,340 --> 00:37:28,673 PATRICK WINSTON: So there must be princes and princesses 860 00:37:28,673 --> 00:37:31,680 of science back there somewhere who view this as a long-term 861 00:37:31,680 --> 00:37:33,330 investment that will have some-- 862 00:37:33,330 --> 00:37:35,288 GIORGIO METTA: It was in the scientific program 863 00:37:35,288 --> 00:37:36,970 of the Institute to invest in robotics. 864 00:37:36,970 --> 00:37:41,670 And one day they may be looking at the results 865 00:37:41,670 --> 00:37:45,020 to see whether we've done a good job, 866 00:37:45,020 --> 00:37:48,140 or decide to fire us all, whatever. 867 00:37:48,140 --> 00:37:51,790 I mean, that might be the case. 868 00:37:51,790 --> 00:37:54,930 And I think it was-- 869 00:37:54,930 --> 00:37:58,770 unexpectedly, IIT started in 2006. 870 00:37:58,770 --> 00:38:00,900 And the scientific program included 871 00:38:00,900 --> 00:38:05,550 robotics, and with all the hype about robotics 872 00:38:05,550 --> 00:38:09,180 that started in recent years, Google acquiring companies, 873 00:38:09,180 --> 00:38:13,290 this and that, I think in hindsight 874 00:38:13,290 --> 00:38:17,257 it has been a good choice to be in robotics at that time. 875 00:38:17,257 --> 00:38:18,465 Just by sheer luck, probably. 876 00:38:21,174 --> 00:38:22,590 RUSS TEDRAKE: To be clear, I think 877 00:38:22,590 --> 00:38:25,290 we're having fun but solving all the right problems while-- 878 00:38:25,290 --> 00:38:26,510 I think we just sort of-- 879 00:38:26,510 --> 00:38:27,160 yeah. 880 00:38:27,160 --> 00:38:28,620 We lucked out, maybe, a little bit. 881 00:38:28,620 --> 00:38:31,120 But we found a way to have fun and solve the right problems.
882 00:38:31,120 --> 00:38:32,530 So I don't feel that we're-- 883 00:38:32,530 --> 00:38:34,530 GIORGIO METTA: I think it's a combination of fun 884 00:38:34,530 --> 00:38:40,050 and the challenge, so not solving trivial things 885 00:38:40,050 --> 00:38:43,320 just because it's fun, but a combination of the two, 886 00:38:43,320 --> 00:38:45,975 seeing something as an unsolved problem. 887 00:38:45,975 --> 00:38:47,850 STEFANIE TELLEX: So I try really hard to only 888 00:38:47,850 --> 00:38:49,560 work on things that are fun and to spend 889 00:38:49,560 --> 00:38:51,909 as little time as possible on things that are not fun. 890 00:38:51,909 --> 00:38:53,700 And I don't think of it as a shallow thing. 891 00:38:53,700 --> 00:38:55,991 I think of it as a kind of resource optimization thing, 892 00:38:55,991 --> 00:38:58,140 because I'm about 1,000 times more productive when 893 00:38:58,140 --> 00:39:00,780 I'm having fun than when I'm not having fun. 894 00:39:00,780 --> 00:39:04,320 So even if it was more serious or something, 895 00:39:04,320 --> 00:39:07,260 I would get so much less done that it's just not worth it. 896 00:39:07,260 --> 00:39:11,430 It's better to do the fun thing and work the long hours 897 00:39:11,430 --> 00:39:13,170 because it's fun. 898 00:39:13,170 --> 00:39:16,590 So for me it's just-- it's still obviously the right thing 899 00:39:16,590 --> 00:39:20,530 because so much more gets done that way, for me. 900 00:39:20,530 --> 00:39:23,430 PATRICK WINSTON: Well, to put another twist on this, 901 00:39:23,430 --> 00:39:27,090 if you were a DARPA program manager, 902 00:39:27,090 --> 00:39:35,270 what would you do for the next round of progress in robotics? 903 00:39:35,270 --> 00:39:37,490 Do you have a sense of what ought to be next? 904 00:39:37,490 --> 00:39:40,910 Or maybe what the flaws in previous programs have been? 905 00:39:44,786 --> 00:39:47,160 STEFANIE TELLEX: So we've been talking to a DARPA program 906 00:39:47,160 --> 00:39:49,140 manager about what they should do next. 907 00:39:49,140 --> 00:39:51,420 And we got a seedling for a program 908 00:39:51,420 --> 00:39:54,900 to think about planning in really large state-action 909 00:39:54,900 --> 00:39:58,242 spaces to enable-- 910 00:39:58,242 --> 00:39:59,700 the sort of middle part of my talk, 911 00:39:59,700 --> 00:40:01,616 we were talking about the dime problem, right? 912 00:40:01,616 --> 00:40:04,830 So we wanted a planner that could find actions 913 00:40:04,830 --> 00:40:09,090 like picking up a dime-- small-scale actions-- 914 00:40:09,090 --> 00:40:11,460 but also large-scale things like unload the truck 915 00:40:11,460 --> 00:40:13,270 or clean up the warehouse. 916 00:40:13,270 --> 00:40:14,220 And so we were-- 917 00:40:14,220 --> 00:40:15,780 because we thought that is what's 918 00:40:15,780 --> 00:40:17,730 needed to interpret natural language commands 919 00:40:17,730 --> 00:40:20,870 and interact with a person at the right level of abstraction. 920 00:40:20,870 --> 00:40:23,560 So we have a seedling to work on that. 921 00:40:23,560 --> 00:40:25,770 JOHN LEONARD: So if I could clone myself, 922 00:40:25,770 --> 00:40:27,600 say I made four or five copies of myself. 923 00:40:27,600 --> 00:40:30,660 With one of them, if I were a DARPA program manager, 924 00:40:30,660 --> 00:40:32,740 I would do Google for the physical world.
925 00:40:32,740 --> 00:40:35,190 So think about having like an object-based understanding 926 00:40:35,190 --> 00:40:39,060 of things and people in the environment and places, 927 00:40:39,060 --> 00:40:43,080 and being able to do the equivalent of internet search, 928 00:40:43,080 --> 00:40:46,440 physical world search, just combining perception 929 00:40:46,440 --> 00:40:48,280 and then being able to go get objects. 930 00:40:48,280 --> 00:40:51,510 So the physical-- like, wget for the physical world. 931 00:40:51,510 --> 00:40:56,350 That's what I would like to do. 932 00:40:56,350 --> 00:40:58,110 TONY PRESCOTT: In the UK, I think-- 933 00:40:58,110 --> 00:40:59,820 so I don't know about DARPA. 934 00:40:59,820 --> 00:41:02,380 But the government made it one of their Eight 935 00:41:02,380 --> 00:41:04,230 Great Technologies a few years ago, 936 00:41:04,230 --> 00:41:06,470 robotics and autonomous systems. 937 00:41:06,470 --> 00:41:09,845 And looking again, now, at the priorities, they're now 938 00:41:09,845 --> 00:41:11,970 asking, well, what are the disruptive technologies? 939 00:41:11,970 --> 00:41:15,720 And robotics, again, is coming out as one of the things 940 00:41:15,720 --> 00:41:17,230 that they think is important. 941 00:41:17,230 --> 00:41:21,160 So in terms of potential economic and societal impact, 942 00:41:21,160 --> 00:41:22,590 I think it's huge. 943 00:41:22,590 --> 00:41:25,500 And so if US funding agencies aren't doing it-- 944 00:41:25,500 --> 00:41:29,640 PATRICK WINSTON: What do you see those applications as being? 945 00:41:29,640 --> 00:41:31,110 TONY PRESCOTT: I think-- 946 00:41:31,110 --> 00:41:35,500 well, the big one that interests me is assistive technology. 947 00:41:35,500 --> 00:41:37,620 In Europe, Japan, I think the US, 948 00:41:37,620 --> 00:41:40,140 we're faced with aging society issues. 949 00:41:40,140 --> 00:41:43,752 And I think assistive robotics in all sorts of ways 950 00:41:43,752 --> 00:41:44,460 are going to be-- 951 00:41:44,460 --> 00:41:46,830 PATRICK WINSTON: So you mean for home health care 952 00:41:46,830 --> 00:41:48,030 type of applications? 953 00:41:48,030 --> 00:41:50,010 TONY PRESCOTT: Home health care-- 954 00:41:50,010 --> 00:41:52,880 prosthetics is already a massive growth area. 955 00:41:52,880 --> 00:41:57,310 But robots-- I mean, my generation, 956 00:41:57,310 --> 00:42:01,800 I've looked at the statistics and the number of people that 957 00:42:01,800 --> 00:42:05,640 are going to be in the age group 80 plus 958 00:42:05,640 --> 00:42:10,410 is going to be 50% higher when I reach that age. 959 00:42:10,410 --> 00:42:14,320 And it's a huge burden on younger people to care for us. 960 00:42:14,320 --> 00:42:18,420 So I think independence in old age-- I think in my old age 961 00:42:18,420 --> 00:42:21,122 I would love to be supported by technology. 962 00:42:21,122 --> 00:42:22,830 You can do what you like with a computer, 963 00:42:22,830 --> 00:42:24,905 but you can't physically help somebody. 964 00:42:24,905 --> 00:42:27,080 And that's where robots are different. 965 00:42:27,080 --> 00:42:29,880 So that would be one of the things that excites me, 966 00:42:29,880 --> 00:42:33,690 and one of the reasons I'm interested in applications. 967 00:42:33,690 --> 00:42:35,940 I'm driven, I think, by the excitement of the research 968 00:42:35,940 --> 00:42:37,160 and building stuff.
969 00:42:37,160 --> 00:42:40,260 But I'm also motivated by the potential benefits 970 00:42:40,260 --> 00:42:42,077 of the applications we can make. 971 00:42:42,077 --> 00:42:44,160 PATRICK WINSTON: I suppose if they're good enough, 972 00:42:44,160 --> 00:42:45,900 we won't need dishwashers because they 973 00:42:45,900 --> 00:42:48,221 can do the dishes themselves. 974 00:42:48,221 --> 00:42:48,720 OK. 975 00:42:48,720 --> 00:42:51,640 So now we have a question from the audience, 976 00:42:51,640 --> 00:42:54,570 which if I may paraphrase, there have been a-- 977 00:42:57,930 --> 00:43:00,060 have there been examples-- 978 00:43:00,060 --> 00:43:02,190 I know you think of them all the time, Tony. 979 00:43:02,190 --> 00:43:03,810 But have there been examples where 980 00:43:03,810 --> 00:43:07,260 work on robotics in your respective activities 981 00:43:07,260 --> 00:43:11,340 has shed new light on a biological problem 982 00:43:11,340 --> 00:43:13,290 or inspired a biological inquiry that 983 00:43:13,290 --> 00:43:16,680 wouldn't have happened without the kind of stuff you do? 984 00:43:16,680 --> 00:43:19,229 RUSS TEDRAKE: I started off more as a biologist, I guess. 985 00:43:19,229 --> 00:43:20,770 I was in a computational neuroscience 986 00:43:20,770 --> 00:43:22,260 lab with Sebastian Seung. 987 00:43:22,260 --> 00:43:24,700 I tried to study a lot about how the brain works, 988 00:43:24,700 --> 00:43:27,270 how the motor system works, in the hopes 989 00:43:27,270 --> 00:43:29,310 that it would help me make better robots. 990 00:43:29,310 --> 00:43:30,935 PATRICK WINSTON: Oh, maybe pause there. 991 00:43:30,935 --> 00:43:31,620 Did it? 992 00:43:31,620 --> 00:43:33,150 RUSS TEDRAKE: Yeah, it didn't. 993 00:43:33,150 --> 00:43:35,460 So I don't use that stuff right now. 994 00:43:35,460 --> 00:43:37,360 I mean, maybe one day again. 995 00:43:37,360 --> 00:43:40,930 But our hardware is very different, 996 00:43:40,930 --> 00:43:44,694 our computational hardware is very different right now. 997 00:43:44,694 --> 00:43:47,110 I think there's sort of a race to understand intelligence, 998 00:43:47,110 --> 00:43:49,170 and maybe we'll converge again. 999 00:43:49,170 --> 00:43:52,650 But right now, the things I write down for the robots today 1000 00:43:52,650 --> 00:43:56,270 don't look anything like what I think the brain-- 1001 00:43:56,270 --> 00:44:01,640 what I was learning about what the brain did back then. 1002 00:44:01,640 --> 00:44:05,250 But that doesn't mean there's not tons of cross-pollination. 1003 00:44:05,250 --> 00:44:10,250 So we have a great project with a biologist at Harvard, Andy 1004 00:44:10,250 --> 00:44:12,290 Biewener. 1005 00:44:12,290 --> 00:44:15,800 Andy has been studying maneuvering flight in birds. 1006 00:44:15,800 --> 00:44:19,110 He's instrumenting birds flying through dense obstacles. 1007 00:44:19,110 --> 00:44:21,690 We're trying to make UAVs fly through dense obstacles. 1008 00:44:21,690 --> 00:44:24,649 We're exchanging capabilities and ideas 1009 00:44:24,649 --> 00:44:25,690 and going back and forth. 1010 00:44:25,690 --> 00:44:27,630 Just the algorithms that we have written 1011 00:44:27,630 --> 00:44:30,610 help him understand what birds are doing and vice versa. 1012 00:44:30,610 --> 00:44:32,560 So there's tons of exchanges.
1013 00:44:32,560 --> 00:44:35,340 But the code that I write to power the robots today, 1014 00:44:35,340 --> 00:44:38,800 I think, is not quite what the brain is doing, 1015 00:44:38,800 --> 00:44:41,100 nor should it be. 1016 00:44:41,100 --> 00:44:44,110 PATRICK WINSTON: Any other thoughts on that? 1017 00:44:44,110 --> 00:44:47,350 GIORGIO METTA: Well, we have experiments 1018 00:44:47,350 --> 00:44:50,560 that I meant to actually present today, 1019 00:44:50,560 --> 00:44:56,620 where we've been working with neuroscientists on trying 1020 00:44:56,620 --> 00:45:01,770 to bring some of the principles from neuroscience to the robot's 1021 00:45:01,770 --> 00:45:02,640 construction. 1022 00:45:02,640 --> 00:45:06,700 Let's say, not the physical robot, but the software. 1023 00:45:06,700 --> 00:45:15,260 And I always find it difficult to find the level of abstraction 1024 00:45:15,260 --> 00:45:19,550 that is actually motivated by something from neuroscience 1025 00:45:19,550 --> 00:45:22,980 and manages to show something important for computation. 1026 00:45:22,980 --> 00:45:29,670 I think I only have one example, or maybe two overall. 1027 00:45:29,670 --> 00:45:35,520 And it always happens not by copying the brain structure in detail, 1028 00:45:35,520 --> 00:45:38,120 but just by taking an idea of what information 1029 00:45:38,120 --> 00:45:40,550 may be relevant for a certain task 1030 00:45:40,550 --> 00:45:42,290 and trying to figure out solutions 1031 00:45:42,290 --> 00:45:44,690 that use that information. 1032 00:45:44,690 --> 00:45:48,880 In particular, a couple of things we've done 1033 00:45:48,880 --> 00:45:55,010 had to do with the involvement of motor control 1034 00:45:55,010 --> 00:45:57,140 information in perception. 1035 00:45:57,140 --> 00:46:00,570 And that's something that sort of paid off, 1036 00:46:00,570 --> 00:46:04,280 at least in small experiments. 1037 00:46:04,280 --> 00:46:08,540 Still, we can't compare with full-blown systems. 1038 00:46:08,540 --> 00:46:11,320 Like, we've done experiments in speech perception 1039 00:46:11,320 --> 00:46:15,470 that showed we outperform systems 1040 00:46:15,470 --> 00:46:17,510 that don't use motor information, 1041 00:46:17,510 --> 00:46:19,430 but in limited settings. 1042 00:46:19,430 --> 00:46:22,490 We don't know what happens if we build the full speech recognition 1043 00:46:22,490 --> 00:46:25,880 system, whether we'd be better or worse than existing 1044 00:46:25,880 --> 00:46:30,200 commercial methods or commercial systems. 1045 00:46:30,200 --> 00:46:35,300 So there's still a long way to go to actually show 1046 00:46:35,300 --> 00:46:41,580 that we managed to get something from the biological 1047 00:46:41,580 --> 00:46:42,940 counterpart. 1048 00:46:42,940 --> 00:46:45,220 Although maybe for the neuroscientists 1049 00:46:45,220 --> 00:46:46,510 this explains something. 1050 00:46:46,510 --> 00:46:51,760 Because where they didn't have a specific theory, 1051 00:46:51,760 --> 00:46:53,290 at least we showed the advantages 1052 00:46:53,290 --> 00:46:54,940 of that particular solution that is 1053 00:46:54,940 --> 00:46:58,780 being used by the brain. 1054 00:46:58,780 --> 00:47:00,570 TONY PRESCOTT: So I think there's-- 1055 00:47:00,570 --> 00:47:04,830 we tend to forget in our history where our ideas came from.
1056 00:47:04,830 --> 00:47:08,460 So for instance, reinforcement learning-- 1057 00:47:08,460 --> 00:47:10,290 Demis Hassabis explained last night 1058 00:47:10,290 --> 00:47:13,740 how he's using this to play Atari computer 1059 00:47:13,740 --> 00:47:16,530 games and this amazing system he's developing. 1060 00:47:16,530 --> 00:47:19,560 If you go back in the history of reinforcement learning, 1061 00:47:19,560 --> 00:47:23,400 the key idea there came from two psychologists, 1062 00:47:23,400 --> 00:47:25,680 Rescorla and Wagner, developing a theory 1063 00:47:25,680 --> 00:47:27,470 of classical conditioning. 1064 00:47:27,470 --> 00:47:34,170 And then that got picked up in machine learning in the 1980s, 1065 00:47:34,170 --> 00:47:40,120 and it got really developed and hugely accelerated. 1066 00:47:40,120 --> 00:47:42,900 But then there was crossover back into neuroscience 1067 00:47:42,900 --> 00:47:44,900 with dopamine theory and so on. 1068 00:47:44,900 --> 00:47:47,880 And ideas about hierarchical reinforcement learning 1069 00:47:47,880 --> 00:47:51,600 have been developed that are partly brain inspired. 1070 00:47:51,600 --> 00:47:53,630 So I think it's-- 1071 00:47:53,630 --> 00:47:56,370 there is crossover, and sometimes we 1072 00:47:56,370 --> 00:47:58,764 may lose track of how much crossover there is. 1073 00:47:58,764 --> 00:48:01,180 PATRICK WINSTON: I have another comment from the audience. 1074 00:48:01,180 --> 00:48:03,870 I see we are under some pressure to not drone on 1075 00:48:03,870 --> 00:48:05,530 for the rest of the evening. 1076 00:48:05,530 --> 00:48:10,170 So the comment is, I think, relevant to the last topic I 1077 00:48:10,170 --> 00:48:13,980 wanted to bring up, which is the question of ethics 1078 00:48:13,980 --> 00:48:14,670 in all of this. 1079 00:48:14,670 --> 00:48:19,260 And the comment is, why should we make 1080 00:48:19,260 --> 00:48:20,939 robots that are good at operating, doing 1081 00:48:20,939 --> 00:48:22,980 things in the household, taking care of the elderly 1082 00:48:22,980 --> 00:48:26,140 and so on, when the rest of AI is going hell-bent 1083 00:48:26,140 --> 00:48:28,570 to put a lot of people out of work-- people 1084 00:48:28,570 --> 00:48:30,910 who could perhaps use those jobs? 1085 00:48:30,910 --> 00:48:35,700 But in any event, there's been a lot of concern, 1086 00:48:35,700 --> 00:48:39,420 perhaps spawned by some of the films like Ex Machina 1087 00:48:39,420 --> 00:48:42,140 and so on, that robots will take over. 1088 00:48:42,140 --> 00:48:44,820 And I don't think they are going to take over in that sense 1089 00:48:44,820 --> 00:48:45,380 very soon. 1090 00:48:45,380 --> 00:48:46,570 But do you see-- 1091 00:48:46,570 --> 00:48:49,470 do you worry-- do you think about any dangers of the kinds 1092 00:48:49,470 --> 00:48:51,210 of technology you're working on in terms 1093 00:48:51,210 --> 00:48:54,522 of economic dislocation or battlefield robots 1094 00:48:54,522 --> 00:48:55,980 or anything of that sort that might 1095 00:48:55,980 --> 00:48:59,807 come about as a consequence of what you do? 1096 00:48:59,807 --> 00:49:01,390 RUSS TEDRAKE: I think it's inevitable. 1097 00:49:01,390 --> 00:49:03,870 I think we shouldn't fear it, but we 1098 00:49:03,870 --> 00:49:05,860 have to be conscious of it.
1099 00:49:05,860 --> 00:49:08,640 So I mean, would you look back to the 1980s 1100 00:49:08,640 --> 00:49:12,300 and avoid the invention of the personal computer 1101 00:49:12,300 --> 00:49:15,030 because it was going to change the way people had to do work? 1102 00:49:15,030 --> 00:49:17,160 I mean, of course you wouldn't. 1103 00:49:17,160 --> 00:49:19,934 But at the same time that changed the way 1104 00:49:19,934 --> 00:49:20,850 people had to do work. 1105 00:49:20,850 --> 00:49:25,015 And it was painful for a big portion of the population, 1106 00:49:25,015 --> 00:49:26,640 but ultimately it was good for society. 1107 00:49:26,640 --> 00:49:29,990 I think robots will have the same sort of effect. 1108 00:49:29,990 --> 00:49:32,520 It's going to raise the bar on what 1109 00:49:32,520 --> 00:49:33,800 people are capable of doing. 1110 00:49:33,800 --> 00:49:36,540 It's going to raise the bar on what people have 1111 00:49:36,540 --> 00:49:38,790 to do to be successful in their jobs. 1112 00:49:38,790 --> 00:49:41,130 And it might be painful, but I think 1113 00:49:41,130 --> 00:49:45,350 it's super important for society to keep moving on it. 1114 00:49:45,350 --> 00:49:47,641 PATRICK WINSTON: Why, again, is it super important? 1115 00:49:47,641 --> 00:49:49,640 RUSS TEDRAKE: Because it's going to advance what 1116 00:49:49,640 --> 00:49:50,890 we're capable of as a society. 1117 00:49:50,890 --> 00:49:54,352 It's going to make us ultimately more productive. 1118 00:49:54,352 --> 00:49:56,392 PATRICK WINSTON: Other thoughts? 1119 00:49:56,392 --> 00:49:57,350 TONY PRESCOTT: I agree. 1120 00:49:57,350 --> 00:50:00,560 I mean, I think the people that are worrying about jobs being 1121 00:50:00,560 --> 00:50:02,460 taken by robots aren't the people that 1122 00:50:02,460 --> 00:50:03,980 want to do those jobs. 1123 00:50:03,980 --> 00:50:06,590 Because most of the jobs are ones that it's 1124 00:50:06,590 --> 00:50:08,780 very hard to get anyone to do. 1125 00:50:08,780 --> 00:50:13,080 They're low-paid, they're unpleasant. 1126 00:50:13,080 --> 00:50:17,870 And we're automating the dull and dreary aspects 1127 00:50:17,870 --> 00:50:19,610 of human existence. 1128 00:50:19,610 --> 00:50:21,800 And that gives the opportunity for people 1129 00:50:21,800 --> 00:50:23,750 to have more fulfilling lives. 1130 00:50:23,750 --> 00:50:26,570 Now, the problem isn't that we're 1131 00:50:26,570 --> 00:50:29,510 doing this great work to get robots or machines 1132 00:50:29,510 --> 00:50:30,940 to do these things for us. 1133 00:50:30,940 --> 00:50:33,410 It's that, as a society, we're not 1134 00:50:33,410 --> 00:50:36,230 thinking about how we adjust to that, 1135 00:50:36,230 --> 00:50:39,160 how we make sure people will have fulfilling lives 1136 00:50:39,160 --> 00:50:43,280 and will be supported materially to enjoy that prosperity. 1137 00:50:43,280 --> 00:50:45,620 So I think it's disruptive in many ways, 1138 00:50:45,620 --> 00:50:47,710 and it's going to be disruptive politically. 1139 00:50:47,710 --> 00:50:49,460 And we're going to have to adapt. 1140 00:50:49,460 --> 00:50:52,940 Because if you're not working, then you 1141 00:50:52,940 --> 00:50:55,030 have to be supported to enjoy your life. 1142 00:50:55,030 --> 00:50:58,650 And maybe that means a change in the political system. 1143 00:50:58,650 --> 00:51:01,790 So those are questions perhaps not for us.
1144 00:51:01,790 --> 00:51:03,560 But I think we maybe-- 1145 00:51:03,560 --> 00:51:05,960 as the technologists, we have to be prepared 1146 00:51:05,960 --> 00:51:09,800 to admit that what we're working on are really disruptive systems 1147 00:51:09,800 --> 00:51:11,780 and they are going to have these large impacts. 1148 00:51:11,780 --> 00:51:13,790 And people are waking up to that. 1149 00:51:13,790 --> 00:51:16,620 And if we wave our hands and say, don't worry, 1150 00:51:16,620 --> 00:51:20,810 I think we're not going to be taken seriously. 1151 00:51:20,810 --> 00:51:23,030 PATRICK WINSTON: Other thoughts? 1152 00:51:23,030 --> 00:51:26,420 JOHN LEONARD: I see how these are really important questions. 1153 00:51:26,420 --> 00:51:28,070 And I see-- 1154 00:51:28,070 --> 00:51:29,000 I have mixed emotions. 1155 00:51:29,000 --> 00:51:30,680 I'm really torn. 1156 00:51:30,680 --> 00:51:32,360 I came from a family that was affected 1157 00:51:32,360 --> 00:51:33,920 by unemployment in the 1970s. 1158 00:51:33,920 --> 00:51:39,140 So I feel like I'm very sympathetic to the potential 1159 00:51:39,140 --> 00:51:40,440 for losing jobs. 1160 00:51:40,440 --> 00:51:43,580 At CSAIL we've had this wonderful discussion 1161 00:51:43,580 --> 00:51:46,100 with some economists at MIT the last few years, 1162 00:51:46,100 --> 00:51:53,780 Frank Levy, David Autor, Erik Brynjolfsson, and Andy McAfee, 1163 00:51:53,780 --> 00:51:55,630 and I've learned a lot from them. 1164 00:51:55,630 --> 00:51:58,100 And I think that they vary in their views. 1165 00:51:58,100 --> 00:52:00,994 I think I am more along the lines of someone 1166 00:52:00,994 --> 00:52:02,660 like David Autor, who's an economist who 1167 00:52:02,660 --> 00:52:05,780 thinks that we shouldn't fear too 1168 00:52:05,780 --> 00:52:07,320 rapid a replacement by robots. 1169 00:52:07,320 --> 00:52:10,310 If you look at the data, the things that are-- 1170 00:52:10,310 --> 00:52:13,920 I would say the things that are hard for robots are still hard. 1171 00:52:13,920 --> 00:52:17,030 But on the other hand, I think, longer term, we 1172 00:52:17,030 --> 00:52:19,370 do have to be mindful, as a society, that, like, 1173 00:52:19,370 --> 00:52:22,046 as Russ said, things like this are going to happen. 1174 00:52:22,046 --> 00:52:24,420 I think that the short-term introduction-- if you look at, 1175 00:52:24,420 --> 00:52:27,060 for example, Kiva and how they've changed the way 1176 00:52:27,060 --> 00:52:27,920 a warehouse works-- 1177 00:52:27,920 --> 00:52:31,460 I think replacing humans just completely with robots, 1178 00:52:31,460 --> 00:52:33,950 like, say, for gardening or agriculture, is a really 1179 00:52:33,950 --> 00:52:37,310 hard thing to do, because the problems are so hard. 1180 00:52:37,310 --> 00:52:41,000 But if you rethink the task to have humans and robots working 1181 00:52:41,000 --> 00:52:43,940 together, Kiva's a good example of how you actually 1182 00:52:43,940 --> 00:52:44,976 can change things. 1183 00:52:44,976 --> 00:52:46,850 And so that's where I think the short term is 1184 00:52:46,850 --> 00:52:49,490 going to come from-- humans and robots working together. 1185 00:52:49,490 --> 00:52:52,744 That's why I think HRI is such an important topic.
1186 00:52:52,744 --> 00:52:54,160 PATRICK WINSTON: Well, I don't know 1187 00:52:54,160 --> 00:52:57,560 if you're running for president, but be that as it may, 1188 00:52:57,560 --> 00:52:59,740 do any of you have a one-minute closing 1189 00:52:59,740 --> 00:53:02,412 statement you'd like to make? 1190 00:53:02,412 --> 00:53:04,870 JOHN LEONARD: Well, I'll go with sort of the deep learning thing. 1191 00:53:04,870 --> 00:53:10,900 I think in robotics we have, potentially, a coming 1192 00:53:10,900 --> 00:53:12,760 divide between the folks that believe more 1193 00:53:12,760 --> 00:53:16,070 in data-driven learning methods and those that believe more in models. 1194 00:53:16,070 --> 00:53:19,580 And I'm a believer more on the model-based side, 1195 00:53:19,580 --> 00:53:21,895 that we don't have enough data and enough systems. 1196 00:53:25,690 --> 00:53:29,530 But I do fear that we could be in a society-- 1197 00:53:29,530 --> 00:53:33,010 for certain classes of problems, he or she who has the data 1198 00:53:33,010 --> 00:53:35,830 may win, in terms of if Google or Facebook have 1199 00:53:35,830 --> 00:53:38,740 just such massive amounts of data for certain problems 1200 00:53:38,740 --> 00:53:40,561 that academics can't compete. 1201 00:53:40,561 --> 00:53:43,060 So I do feel there's a place for the professor and the seven 1202 00:53:43,060 --> 00:53:45,340 grad students and a couple of post-docs. 1203 00:53:45,340 --> 00:53:46,964 But if you do, you have to be careful 1204 00:53:46,964 --> 00:53:48,880 in terms of problem selection, that you're not 1205 00:53:48,880 --> 00:53:52,030 going right up against sort of one of these data machine 1206 00:53:52,030 --> 00:53:52,655 companies. 1207 00:53:52,655 --> 00:53:54,280 RUSS TEDRAKE: I was going to say that if I 1208 00:53:54,280 --> 00:53:56,030 were looking at humans right now and trying 1209 00:53:56,030 --> 00:53:59,050 to inform the robots, I wouldn't look 1210 00:53:59,050 --> 00:54:01,577 at center-out reaching movements or nominal walking 1211 00:54:01,577 --> 00:54:02,410 or things like this. 1212 00:54:02,410 --> 00:54:04,120 I'd be pushing for the corner cases. 1213 00:54:04,120 --> 00:54:06,460 I've been trying to really understand 1214 00:54:06,460 --> 00:54:09,920 the performance of biological intelligence 1215 00:54:09,920 --> 00:54:13,720 in the screw cases, in the cases where they didn't 1216 00:54:13,720 --> 00:54:15,250 have a lot of prior data. 1217 00:54:15,250 --> 00:54:18,070 They were once-in-a-lifetime experiences. 1218 00:54:18,070 --> 00:54:20,980 How did natural intelligence respond? 1219 00:54:20,980 --> 00:54:23,740 That's, I think, a grand challenge 1220 00:54:23,740 --> 00:54:26,840 for us on the computational intelligence side. 1221 00:54:26,840 --> 00:54:29,964 And maybe there's a lot to learn. 1222 00:54:29,964 --> 00:54:31,630 PATRICK WINSTON: And the grand challenge 1223 00:54:31,630 --> 00:54:33,820 for me, to conclude all this, has 1224 00:54:33,820 --> 00:54:38,230 to do with what it would really take to make a robot humanoid. 1225 00:54:38,230 --> 00:54:41,050 And I've been thinking a lot about that 1226 00:54:41,050 --> 00:54:44,050 recently in connection with self-awareness, 1227 00:54:44,050 --> 00:54:47,290 understanding the story, having the robot understand 1228 00:54:47,290 --> 00:54:50,080 the story of what's going on throughout the day, 1229 00:54:50,080 --> 00:54:53,530 having it be able to use previous experiences to guide 1230 00:54:53,530 --> 00:54:55,880 its future experiences, and so on.
1231 00:54:55,880 --> 00:54:58,090 So there's a lot to be done, that's for sure. 1232 00:54:58,090 --> 00:55:01,064 And I'm sure we'll be working together as time goes on. 1233 00:55:01,064 --> 00:55:02,730 Now I'd just like to thank the panelists 1234 00:55:02,730 --> 00:55:05,480 and conclude the evening.