The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high-quality educational resources for free. To make a donation or view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

JUSTIN CURRY: All right, welcome back, everybody. First of all, I want to apologize to the virtual world out there that we didn't get the last lecture on tape. So I want to do a quick recap -- honestly, no more than two or three minutes on what happened. And that's why I put all the effort into putting it up on the board first.

Remember, we're dealing with a formal system here, and I'm just going to denote this formal system as F. Recently, we've been interested in Typographical Number Theory, TNT. And we carried out this process of encoding symbols of F through this Gödel numbering process, essentially giving the equals sign a number, 555, things like this. So we now have symbols of F corresponding to a small subset of all natural numbers. We then have this correspondence, this actually exact, mathematically precise isomorphism, between strings of F and the subset of all numbers which are Gödel numbers of some string -- the strings that we have over there.

We then take our axioms and rules of F -- remember, these were formal, recursive operations on strings in our formal system -- and we arithmetize them. We said, OK, we're going to let the process of induction be equivalent to taking 10 times blah, blah, blah, this number, blah, blah, blah, divided by blah, blah, blah, Chinese Remainder Theorem, blah, blah, blah. And we're going to be able to do the same simple shunting we would do with the MIU system or with Typographical Number Theory, and just do it in the form of numbers. And this is why, with Gödel's Incompleteness Theorem, we need formal systems strong enough to encompass number theory in order to carry out this process.
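As a concrete illustration of that encoding step, here is a minimal sketch in Python. Only the codon for the equals sign (555) comes from the lecture; every other symbol-to-codon assignment in this table is invented purely for illustration.

```python
# A toy Godel numbering: each symbol of the formal system gets a
# three-digit codon, and a string is encoded by concatenating codons.
# Only '=' -> 555 is from the lecture; the other codons are made up.
CODONS = {
    "0": 666, "S": 123, "=": 555, "+": 112,
    "(": 362, ")": 323, "a": 262, "'": 163,
}

def godel_number(string: str) -> int:
    """Concatenate the three-digit codon of every symbol."""
    return int("".join(f"{CODONS[sym]:03d}" for sym in string))

# The TNT theorem S0+S0=SS0 ("1 + 1 = 2"):
print(godel_number("S0+S0=SS0"))  # -> 123666112123666555123123666
```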
So we then have strings which are theorems of F, and these correspond to numbers that are F-producible. We then do this leap outside of the system. We have meta-F, right? We're taking statements about strings of our formal system. And this corresponds, in number theory, to taking statements about the numbers which are F-producible, the F-numbers. An example of this in number theory would be, say, 641 is prime, or 6 is a perfect number. And we can denote this property, perfect number, and make it a property with a free variable. Is 5 a perfect number? No. Is 6 a perfect number? Yes.

In the same way, we can create a new property of numbers called primness, which corresponds to provability. And then the problem of determining whether a particular string is a theorem of F is equivalent to establishing whether or not a number is prim.
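Here is a toy version of that equivalence in Python, using the MIU system mentioned earlier rather than full TNT, with the encoding GEB uses (M=3, I=1, U=0, read as ordinary decimal digits). The four MIU rules become arithmetic on numbers, and "is this string a theorem?" becomes "is this number producible?" -- the toy counterpart of primness. This is a sketch of the idea, not anything worked out in the lecture itself.

```python
from collections import deque

def successors(n: int) -> set[int]:
    """Numbers reachable from MIU-number n by one rule application
    (M=3, I=1, U=0 as decimal digits, so MI = 31 and MU = 30)."""
    out = set()
    s = str(n)
    k = len(s) - 1                       # number of digits after the M
    if n % 10 == 1:                      # Rule I:   xI  -> xIU
        out.add(n * 10)
    x = n - 3 * 10**k                    # Rule II:  Mx  -> Mxx
    out.add(3 * 10**(2 * k) + x * 10**k + x)
    for m in range(len(s) - 3):          # Rule III: III -> U
        if (n // 10**m) % 1000 == 111:
            out.add((n // 10**(m + 3)) * 10**(m + 1) + n % 10**m)
    for m in range(len(s) - 2):          # Rule IV:  UU  -> (drop it)
        if (n // 10**m) % 100 == 0:
            out.add((n // 10**(m + 2)) * 10**m + n % 10**m)
    return out

def producible(target: int, max_digits: int = 8) -> bool:
    """Breadth-first search from MI = 31 through the arithmetized rules."""
    seen, frontier = {31}, deque([31])
    while frontier:
        n = frontier.popleft()
        if n == target:
            return True
        for nxt in successors(n):
            if nxt not in seen and len(str(nxt)) <= max_digits:
                seen.add(nxt)
                frontier.append(nxt)
    return False

print(producible(310))  # MIU: True, a theorem (one use of Rule I)
print(producible(30))   # MU:  False, the famous non-theorem of GEB
```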
So let me quickly recap, going back over here to the essential steps in Gödel's proof. First, we arithmetize -- just take that as it is. Arithmetize: we give symbols numbers, and we turn the rules for inference and deduction into operations on numbers. Then we take the property of provability and equate it, through this exact, precise isomorphism which Gödel discovered, to the property of primness.

We then turn to the operation of quining. So, quining: "when preceded by itself in quotations, yields a full sentence." Of course, we could play around with this. "Snow is white" snow is white didn't mean anything. But when we fed that operation, that function, itself into its own variable, we got: "when preceded by itself in quotations, yields a full sentence," when preceded by itself in quotations, yields a full sentence. And that created a full sentence which was itself self-referential.

And it was a fixed point. Remember, a fixed point of a function -- which in our case was the quining operation, but it could have been simply doubling, f(x) = 2x -- is something which, when you feed it into the function, gives you the same thing back out.

So we took quining into an operation on numbers. Our fixed point of quining gave us an inherently self-referential sentence. And then we were able to describe -- not directly spell out, but describe -- a formula which worked like this statement here: "when fed its own Gödel number, yields a non-prim number," when fed its own Gödel number, yields a non-prim number. That describes this big thing called G. And G is essentially the reason why number theory is incomplete. It's what breaks the mathematician's credo, this idea of true if, and only if, provable. With the production of G, we were able to establish that although all provable things are true, not necessarily all true things are provable. And this was a very important idea. It's the take-home message of the course.

We connected this briefly to the halting problem, which is just this idea that you can't have a magical machine which takes in a program and an input and tells you whether or not it'll terminate. Just like you can't have a magical machine which says, give me the Gödel number for some statement in number theory, and I'll tell you whether it's true or false just by detecting whether or not it's prim. These things are inherently impossible and undecidable, and this was kind of the shaking of the foundations of number theory which we wanted to do.

But that's all I wanted to say about this. I want us to move on now and think about more enjoyable things. Because number theory -- great. What I wanted was more just the take-home message of this idea of having things which are true, but not provable within our system. Right? And the idea that we have to jump outside the system.
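Quining has an exact counterpart in programming: a quine, a program whose output is its own source code. Below is one standard Python construction (an illustration, not something from the lecture). The string s plays the role of the quoted phrase, and the final line plays the role of "when preceded by itself in quotations, yields a full sentence."

```python
# The two lines below form a quine: running them prints exactly
# those two lines. The data s is the "quoted phrase"; the print
# feeds the phrase a copy of itself.
s = 's = %r\nprint(s %% s)'
print(s % s)

# And the fixed-point idea in its simplest form: 0 is the fixed
# point of doubling, since feeding it in gives the same thing back.
f = lambda x: 2 * x
assert f(0) == 0
```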
Yes, Atif?

AUDIENCE: Could we do something where we have a formal system, then we get all the Gödel statements, then get formal systems in which those [? hold? ?] And by doing that, we would generate more formal systems, and then get the true statements about them that are forcing each system to be consistent. If you do this for all of the systems you can come up with, can you, like, find out whether they create all the mathematical theories?

JUSTIN CURRY: Right. So essentially this is the idea of going ahead and giving Gödel numbers for our axioms and our rules of inference, and then being able to generate all of mathematics by just feeding it to a computer and letting it carry out these operations on numbers. Right? So we could produce a computer which would produce provable things in that direct fashion, but it'd be incredibly inefficient. Right? Like, one of the remarkable things about humans and the human intellect is that we're able to essentially jump several nodes down the tree. We don't think on the level of formal deductions, on operations on symbols. Right? And also, once again, that's the whole point of Gödel's Incompleteness Theorem: we can't produce all of mathematics that way.

AUDIENCE: But you can certainly try to find statements that are similar to what G says, and then get systems in which those statements [? hold, ?] which code for that system. And then try to explore that other system to see which mathematics it gives you.

JUSTIN CURRY: OK, so is the idea then that we essentially take our G here, and we let that be essentially an axiom of a new system, and then--

AUDIENCE: Just, like, get a system in which that thing can be derived to be true. So it's speaking about another system, by another system, which--

JUSTIN CURRY: Right, exactly.
JUSTIN CURRY: So essentially, it's almost trying to make our system complete by considering a new system where G is derivable. Right, exactly. No, I mean, that's totally key. And mathematicians raised exactly the same point after Gödel did this. But what Gödel showed is that even if you were to tack on G as part of your new system, you could then create an analogous G-prime, which is true but unprovable in that new system.

AUDIENCE: I know, I know that's the problem. But since this [INAUDIBLE] is [INAUDIBLE] all of the mathematics. If we can just find them, and create new systems that sort of support them, then we can have other mathematics that are not covered by the old system. So it's just, like, a way of exploring the mathematical landscape.

JUSTIN CURRY: Well, yeah. We're doing that all the time. Right? But it's not necessarily that we can just start out and somehow get at the complement, the not-provable statements, very easily. I mean, that's essentially what mathematicians do. That's why mathematicians never go out of a job: we fundamentally need to think about and create new concepts and make new definitions, and then explore the deductions of our new assumptions and axioms, which weren't covered by the old system involving number theory. We need to add new stuff on there, which can't be done by a mechanical process, or which requires human intelligence. But, I mean, that's good. If you want to talk about this a little more, we should do it after class, because we've got a lot to pack in today. But let me know if I haven't answered your question fully. We'll do it afterwards.
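That exchange can be summed up in a few schematic lines of Python. Everything here is a stand-in: godel_sentence is assumed, not implemented, and no real function could fill that role mechanically. The only point is that the loop never terminates in a complete system.

```python
# Purely schematic. godel_sentence() stands in for Godel's whole
# construction; it is assumed here, never implemented.
def godel_sentence(axioms):
    """Return a sentence that is true but unprovable from these axioms."""
    raise NotImplementedError

def chase_completeness(axioms):
    # Tacking G on as a new axiom never closes the gap: the extended
    # system is still strong enough to have its own Godel sentence.
    while True:
        g = godel_sentence(axioms)
        axioms = axioms + [g]
```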
So I wanted to quickly digress and take a step back. OK, well, Gödel found this really nice thing. He found this isomorphism, essentially, between PM -- Principia Mathematica -- and the things it was describing. So he was able to get the system to talk about itself. But I think that always raises the question: what if he hadn't discovered this isomorphism? What if he hadn't discovered this kind of link, this analogy?

And this actually relates to a story that Curran and I have from recently. Curran was coming up to Boston, and he says, all right, I'm coming to Kendall soon. All right, great. So I kind of jokingly texted back. I said, OK, have space suit, will travel. Right? So Curran texts back. He goes, ha, ha, ha, excellent. So I'm like, OK, great, he got my joke.

Then we met up, and I was like, how did you like the joke? And he was like, yeah, I thought it was funny. And I was like, well, you got the reference, right? And he was like, no. What are you talking about? Well, I'm like, Have Space Suit--Will Travel is a book by Robert Heinlein. And he was like, no, I've never heard of it. I just thought you were putting on this metaphorical space suit, and that you were going to go meet me at the T, and that's why you could travel. And I was like, oh! Ha, ha, ha. I think it's funny that you thought I was funny, even though you didn't get my joke.

So we kind of talked about this for a little while. And he's like, well, so what's the book about? And I said, actually, I don't know. I haven't read it. So then I Wikipedia-ed it today. And it turns out that at the end of the book, the main character gets a full scholarship to MIT. So there's a third layer of meaning, which neither of us knew. There was this whole level of isomorphism which we didn't know about. But everything seemed to be operating just fine.
So I think it's kind of interesting, this idea: what if we had just gone straight forward with PM, without knowing any kind of analogy -- without this ability for the system to talk about itself? What would have happened? Would mathematics have just marched on fine, thinking it was complete and everything? I just think that's an interesting idea.

But I want to give a quick signpost of what's happening today. How many of you guys read chapter 10, the levels of description of a computer system? Or at least started to read it? Great, excellent, fantastic. Nobody read the wrong chapter, which was chapter 16. That's a good chapter, but it's a little advanced, in that it uses a lot of concepts from chapters 13 and 14, which really help hit home the idea of Gödel's proof. You should read it in your own time.

What I want to talk about today is this concept of emergence -- of emergent properties coming out of simple descriptions. Right? We've been kind of hitting this home all the time, but today is our last day to really try to show complexity, and show how it emerges out of simple building blocks.

And really, what this chapter starts with -- I mean, why do you think we talk about computer systems? Does anyone have an idea? Did anyone find it interesting? Sandra?

AUDIENCE: [INAUDIBLE] there just seems [INAUDIBLE] and break them down.

JUSTIN CURRY: So we use computers in the same way that we think, in terms of breaking things down--

AUDIENCE: Yeah, and that's how we keep [? coming around ?] and fix it [? logically. ?]

JUSTIN CURRY: Right, exactly. So go ahead.

AUDIENCE: So to base it off, relating to your system [INAUDIBLE] programs. Is it based on the way we think, or is it in [? alphabet? ?] [INAUDIBLE].

JUSTIN CURRY: All right. Let me make sure I understand what you're saying.
So are you asking, are the programs that we write just based on how we think? Like, essentially it's just our idea: hey, we want the computer to do this.

AUDIENCE: Yeah, and how it's universal.

JUSTIN CURRY: And how its--

AUDIENCE: [INAUDIBLE]

JUSTIN CURRY: So how its computation is kind of universal. Is that what you're getting at? OK, yeah. So that leads back to the fundamental idea of a Turing machine. Right? Although we each subjectively have our own approach to how we'd solve a problem, it can ultimately be reduced to an algorithm, a set of instructions, which is universal. It's as universal as mathematics, right? So pretty much anybody in any language should understand it. Right? Is that kind of what you wanted to say?

OK, Navine -- oh, sorry. Atif.

AUDIENCE: Could we just be an implementation of a Turing machine?

JUSTIN CURRY: Say again?

AUDIENCE: Could we be just an implementation of a Turing machine?

JUSTIN CURRY: Could we just be? Like, us as humans be implementations?

AUDIENCE: Yeah, and the emotions are, like, different ways of talking about some part of a change.

JUSTIN CURRY: OK, all right. So you're getting at a really interesting idea here: this idea that humans, in our thoughts, are actually best modeled by a Turing machine. Right? And I don't know if you've done the reading, and maybe that's why you're being clever and asking this question, but have you heard of someone named Roger Penrose?

AUDIENCE: Yeah.

JUSTIN CURRY: OK, so Roger Penrose is a very prominent mathematical physicist at Oxford. And he also wrote a pretty interesting book called Shadows of the Mind, in which he suggests that Gödel's Incompleteness Theorem and the halting problem have application to human intelligence.
But what he argues is that humans are fundamentally not Turing machines, but that computers are. Right? And that's why artificial intelligence, in his mind, is impossible: because a machine is always limited to the kinds of constraints which we've cooked up here today and through the halting problem. But humans, obviously, aren't. We can meta-think. We can meta-meta-think, and we can meta-meta-meta-think, always jumping outside the system. Right? We're never bound to the same constraints as a machine. Now, I'll warn you, Penrose's ideas are--

AUDIENCE: Who constrains the machine?

JUSTIN CURRY: Who constrains the machine? Logic. Logic, right? The fact that all a machine can do is NOT, AND, copy, jump -- basic assembly code, right? It is just a Turing machine.
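For reference, here is the constraint being leaned on, sketched in Python. This is just the standard diagonal argument from the recap, not anything Penrose-specific: assume a magical halts() and derive a contradiction. The function names are illustrative; no such halts() can actually be written.

```python
# Sketch of the halting-problem argument. halts() is the "magical
# machine" from the recap -- assumed to exist, never implementable.
def halts(program, argument) -> bool:
    """Pretend oracle: True iff program(argument) would terminate."""
    raise NotImplementedError  # no correct implementation can exist

def trouble(program):
    # Do the opposite of whatever the oracle predicts about
    # running this program on itself.
    if halts(program, program):
        while True:   # oracle said "halts"? then loop forever
            pass
    return            # oracle said "loops"? then halt at once

# Now ask about trouble(trouble): if halts(trouble, trouble) returned
# True, trouble(trouble) would loop; if False, it would halt. Either
# answer is wrong, so the assumed oracle cannot exist.
```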
AUDIENCE: But what can a human do?

JUSTIN CURRY: What can a human do that's not a Turing machine? Good question. We can think emotionally. We can believe -- we can doublethink. Humans, all the time, believe in both P and not P.

AUDIENCE: Yeah, but humans aren't like [INAUDIBLE] systems. I think we think we're one thing, when we're really many things, which can also be contradictory. It's like a bunch of little machines going together, communicating with each other. They can contradict, but they can agree.

JUSTIN CURRY: OK, so you've got essentially this kind of Society of Mind viewpoint -- you know, Marvin Minsky's book -- where human intelligence is best modeled by a bunch of little actors that almost vote on certain beliefs. Right? And I mean, it's a good question. You'd also have to figure out, one, how do the actors work, and how do the actors make decisions? But two, do you really think it's just majority rules inside your brain? Like little guys, little homunculi up in your head, voting: no, I think today we do believe in God. And no, I don't believe in God. And then, all right, everyone raise your hand if you do. If you don't. OK, today we don't believe in God. Right? I mean, is that really how human intelligence works? We don't know. But that's the important thing about studying layers of systems, layers of description. And that's what this chapter is all about.
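That "majority rules" picture is easy to caricature in code. Here is a deliberately crude toy, invented for this transcript, in which the "self" is nothing but a majority vote of tiny agents -- exactly the picture the class goes on to question.

```python
import random

random.seed(0)  # reproducible toy run

# A crude "Society of Mind" caricature: each agent holds one bit of
# belief, and the reported "self" is just the majority. Whether real
# minds work anything like this is exactly the open question.
agents = [random.random() < 0.5 for _ in range(101)]

votes_for = sum(agents)
print("the 'self' believes:", votes_for > len(agents) // 2)
```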
JUSTIN CURRY: So I want to give someone else an opportunity to explain: essentially, why do we study computer systems when we're trying to understand intelligence, trying to understand the mind? Does anyone have an idea?

AUDIENCE: Well, it's so we can-- from things that we think we know we can do, you can try to make comparatively similar things in some way, just to maybe suggest this is how we are sort of the same thing.

JUSTIN CURRY: OK. So possibly, right? I'm not sure. But here's what I came up with, and it's my motivation for why I think people might want to study computer science. This will launch into a whole little lecture -- a what-do-you-want-to-do-with-your-life kind of thing. I'm not going to try to solve that problem for you, but we can try.

So what I think is cool about computers is that back in the day -- if you want to go all the way back to Babbage and these guys in the 1800s -- they were just playing around with gears and things like this. I mean, they were working at the purely physical level. Right? We had this physics governing everything, and Babbage was playing with gears and wheels. But what really makes up computers today is transistors and capacitors: electronic things for doing really small computations. Babbage was thrilled when he could just get a simple calculating machine. Like, a-ha, wow! Now you can do logarithms. This is amazing. But he was programming at that level, right? His levels of description were almost completely based on gears and wheels, and then on what became, by the mid-1900s, transistors and capacitors.

But then we had this kind of revolution in how we do things, and we got up to the level of -- to speak in modern terms -- motherboards and video cards, and things like this. We went ahead and started abstracting entire chunks of hardware to do very specific, highly designed things, and we just started playing around with those chunks. Like, a-ha! Now I don't have as much lag in my video games. I can have a new video card. Pop that in. So we no longer have-- I mean, how many people, at least everyday computer users, ever worry about the transistors and capacitors in their computer? It doesn't happen.

And then we have the next step up, and this is really where the realm of computer science lives: software and operating systems, which live on top of this, even. And this is amazing. Suddenly, we're dealing with things which don't really exist. Right? We're kind of executing spells. When Curran gets up later today and he hacks away on his computer -- type, type, type, type -- he just puts in these little images that show up on the screen, and then he hits the enter button. And then it suddenly executes magic, and it exists in a different realm. But it all boils down to what's happening on this level of transistors and capacitors. And if you went to somebody and you said, "Hi, my name's X. Can you tell me how Windows works, in terms of capacitors and transistors?" they would just kind of go, [GROAN] do you want to enroll in school for the next four years? Like, sure. And that's really the project here.
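That tower of levels can be shown in miniature. In the Python sketch below -- an illustration added here, not anything from the chapter -- each layer is written purely in terms of the one beneath it: a NAND "gate" at the bottom, logic built only from NAND, and a half adder that never mentions NAND at all.

```python
# The layering in miniature: each level is defined only in terms of
# the level below, and the top level never mentions the bottom.

def nand(a: int, b: int) -> int:        # "transistor" level
    return 1 - (a & b)

def xor(a, b):                           # "gate" level, built from NAND
    return nand(nand(a, nand(a, b)), nand(b, nand(a, b)))

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def half_adder(a, b):                    # "circuit board" level
    return xor(a, b), and_(a, b)         # (sum bit, carry bit)

print(half_adder(1, 1))                  # (0, 1): 1 + 1 = binary 10
```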
But what's interesting is that there's a conceptual framework we have for thinking here which is very close to how we hope to crack the problem of the mind. And that's the idea that we have neurons and neurotransmitters -- the things that ultimately govern it all. Sorry, I ran out of space here. We describe things in terms of cable theory, like we're just sending messages down an insulated cable. We have this level of description for the brain at the very small scale, and this is already research-worthy.

But then we have a bunch of biologists, and people who are interested in neuroanatomy, talking about, oh, what about the amygdala? I don't know if I'm spelling any of this right. The hippocampus. Your frontal cortex, your cerebral cortex -- I'll just denote them by "cortices." I don't know if that's the correct pluralization. And then somehow, out of these larger structures -- where you abstract away the details of what's going on with the transcription in this cell, and instead you're just thinking, how is my hippocampus doing today? -- we enter what is essentially the level isomorphic to software and OS, which is the psyche, the mind. And I'll go and put it up here: soul.

And this is fundamentally why artificial intelligence researchers are found in the computer science department. It's because we've had a lot of people who have gone from dealing with very clunky physics, to doing really souped-up stuff on larger physical entities -- motherboards, video cards -- and then getting up to the level of software systems and operating systems.

But I wanted to take a quick poll. Where do your interests lie on this kind of stack -- in both camps here, but especially focusing on the brain and going into the mind? What do you find most interesting? This is going to really reveal a lot about who you are and what you want to do.
What do you find most interesting in these three boxes? If you had to vote for one box, what would it be?

AUDIENCE: [INAUDIBLE]

JUSTIN CURRY: You want two boxes? I can't give you two boxes. You've got to decide. Otherwise, you're going to be in school forever, right? All right, two boxes, but that means you automatically have to go to school for seven years after undergrad. All right?

OK, go ahead. Give me some votes. I want to see some hands. Who likes studying molecules, neurons, neurotransmitters, maybe even basic physics? OK: Sandra, Ders, Maya, Rishi. OK.

Who's into the biology here -- who likes amygdalas, hippocampi, the set of cortices? All right, I've got Atif.

All right, now who's interested in the psyche, the mind, the soul? Who wants to actually sit in an office and deal with people's problems, rather than-- or maybe not. I warn you. You don't want to deal-- OK, so I've got Navine. I'm sorry, I forget your name again.

AUDIENCE: Vivian.

JUSTIN CURRY: Say again?

AUDIENCE: Vivian.

JUSTIN CURRY: Vivian. Vivian, Maya, Atif, and Felix. OK, cool.

AUDIENCE: I have another one to add.

JUSTIN CURRY: OK, go for it.

AUDIENCE: Who wants to understand the commonalities between all of these levels?

JUSTIN CURRY: And we're getting a lot of hands there. All right, so who wants to understand the commonality between all of these? All right. And this is--

Curran and I put together this list, and he'll show you when he comes up and gets the projector rolling. But it's this idea that we've found ourselves working with problems and systems which are so complex that we can only really occupy ourselves with certain layers of abstraction. But the interesting thing is that we can abstract, right?
Just like when we're modding our computer, we don't ever really have to care about transistors and capacitors, right? And similarly, when I'm playing around in MATLAB, I don't ever have to worry about-- well, sometimes I have to worry about memory use. But I don't usually have to worry about any physical details. Yes?

AUDIENCE: You can't have one without the other, right?

JUSTIN CURRY: Right, exactly. As Sandra says, you can't have one without the other. Very important, right? Very, very important: you do have a tower here. But what's maybe scary is the idea that some of these things are universal and are independent. Atif?

AUDIENCE: [INAUDIBLE] you can have the physical things that the soul organizes [INAUDIBLE].

JUSTIN CURRY: OK, so definitely. These arrows only go one direction. You can have the physical thing without the mental, right? We have plenty of trees and things that don't have Windows Vista running on them. Go ahead.

AUDIENCE: What do the arrows represent?

JUSTIN CURRY: What do the arrows represent? Do you have to ask me that question?

AUDIENCE: No, but-- well, I don't know if it's true with the arrow. But there was maybe [INAUDIBLE].

JUSTIN CURRY: All right, that's actually a very good question. So what do the arrows mean? The simple answer would be that they indicate, I guess, derivability or dependence, except the dependency is kind of going the other way. Right? Motherboards and video cards depend upon their underlying transistors and capacitors, but the arrow points up towards them. It's this idea that you can build things out of what's below, right? And this is an important idea: once you've got a basic set of tools available to you, you can start building upwards. Right?
And this is, I think, very interesting, and it's a good concept to study with languages. So when you're learning Spanish or any other language, what's perhaps the most useful phrase that you will ever learn in Spanish?

AUDIENCE: Hola.

JUSTIN CURRY: OK, hola. Why? Why is that useful? You can say hi. Big deal. What if you want to learn more Spanish? What's the best phrase you can learn? And I'm sorry, I don't know many other languages.

AUDIENCE: [INAUDIBLE]

JUSTIN CURRY: OK. ¿Cómo se dice? Excellent. How do you say this? In some way, with those three words, you've opened yourself up to the entire world of a language. Right? ¿Cómo se dice? Oh, libro. OK. ¿Cómo se dice? Bolígrafo. Oh, wonderful. Fantastic. So suddenly you have this way of prompting the outside world, just by learning three words, and you could kind of learn the entire language from there. And that's an important thing: once you've got some basic elements, some building blocks of words and vocabulary, you can immediately consider this loftier layer of what happens when you put words together.

So let's go to A, B, C. What happens when we have just A by itself, B by itself, or we could have AB, or we could have AC, and then BC? We already have this idea that once you've got some things, you can take the power set of them, and get-- oh, I'm sorry, I'm forgetting something crucial: the empty set, and ABC itself. And you suddenly have a whole layer of complexity just by considering the meta-level beyond your first layer here. You learn a handful of Spanish words, and then just by considering their combinations and permutations, and their ability to prompt the outside world, you can suddenly feed into a whole other layer of abstraction, another step up.
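That jump from three elements to all their combinations is a few lines of Python. This sketch is illustrative; the printed list is the full power set, including the two pieces that are easy to forget, the empty set and ABC itself.

```python
from itertools import chain, combinations

def power_set(items):
    """Every subset of items, from the empty set up to the whole set."""
    return list(chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1)))

print(power_set("ABC"))
# [(), ('A',), ('B',), ('C',), ('A', 'B'), ('A', 'C'), ('B', 'C'),
#  ('A', 'B', 'C')] -- 3 building blocks already give 2**3 = 8 combinations.
```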
It's the same thing here. Say you've got one transistor. OK, good for you. But suppose you have a hundred of them, right? Suddenly you can start doing stuff, and you can have things which emerge that are greater than just the individual parts. I mentioned briefly this idea of ant societies -- ant societies exhibiting these kinds of emergent properties, which are greater than any single ant. And this is an important idea, especially when you think about humans. Humans do this all the time. One person has an idea and another person has another idea, and then you can consider, what's the interface of these two ideas? And you've got another idea. You can suddenly build things out of these simple pieces. And I suppose the merit of studying this is that not only do you understand when you can forget about the details -- abstraction -- but also what you can do, given the pieces that you have. So I think this is kind of an important piece.
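Emergence of this sort is easy to demonstrate, even if it's hard to explain. The sketch below, an illustration added here rather than anything from the lecture, runs Wolfram's Rule 110: one row of cells, each updated by looking only at its two neighbors, yet the global pattern is famously intricate (Rule 110 has even been proved Turing-complete).

```python
# Emergence in miniature: Rule 110, a one-dimensional cellular
# automaton. Each cell sees only its two neighbors, yet the global
# pattern that unfolds is anything but simple.
RULE = 110

def step(cells):
    n = len(cells)
    return [
        # Read the 3-cell neighborhood as a 3-bit number, then look
        # up the corresponding bit of the rule.
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

cells = [0] * 63 + [1]          # start with one live cell on the right
for _ in range(30):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```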
Tied into this, though, is this idea of levels of description. How do you describe something -- even ourselves -- depending on what level you're at? I'm sure many of you ran across the part in chapter 10 where Hofstadter talks about the paranoid and the operating system, and he talks about this program, Parry. He talks through the situation and kind of plays it out -- and I'll go ahead and elaborate on some stuff that wasn't in the book. But I think it's an interesting thought experiment.

So how many of you would identify your bodies as part of yourself? Sandra, Navine, Ders, Atif, Felix, Rishi. OK. So wait, you guys didn't raise your hands, right? So you don't identify your bodies with yourself. OK, so Atif changed his mind. So for those of you who did--

AUDIENCE: Well, is the brain part of my body?

JUSTIN CURRY: Is the brain part of your body? Yeah.

AUDIENCE: OK, then yeah.

JUSTIN CURRY: OK, so the brain's part of your body. Anyone want to change their votes? OK, fine.

If I were to ask you, how's your leg feeling today? How many people would think that's a normal question? OK. So we've got plenty of normal questions.

AUDIENCE: [INAUDIBLE]

JUSTIN CURRY: All right. So suppose I then asked the question which Hofstadter asked, which is, so why are you making so few red blood cells today? Is that a normal question? Yeah, go ahead, Sandra.

AUDIENCE: Yeah.

JUSTIN CURRY: You think it's a normal question?

AUDIENCE: Yeah.

JUSTIN CURRY: Do you have control over how many red blood cells you make?

AUDIENCE: No.

JUSTIN CURRY: But they're a part of your self.

AUDIENCE: It's really a part of me.

JUSTIN CURRY: It's part of you. So why can't I address your red blood cell count as part of you?

AUDIENCE: Well, it's inside of you.

JUSTIN CURRY: It's inside of you, right? But--

AUDIENCE: [INAUDIBLE] very [INAUDIBLE]

JUSTIN CURRY: OK, so you're accepting that as a logical consequence of the things you've said. All right. Well, good for you, because you're being consistent. Does anyone else suddenly feel like I've said something awkward?

AUDIENCE: Maybe we shouldn't even say, how does your leg feel today?

JUSTIN CURRY: Maybe we shouldn't even say, how does your leg-- so if we never address--

AUDIENCE: [INAUDIBLE] deciding which is which?

JUSTIN CURRY: Right. So suppose, I don't know, Curran comes in and he's just got two black eyes. And I'm like, Curran, you look terrible! And he's like, that's just not an acceptable statement, because I'm referring to his physical appearance. Right?

AUDIENCE: And it's respecting his physical feelings, his physical things, so he's almost internalizing that picture and saying, oh, yeah, that's me.
822 00:34:20,120 --> 00:34:23,420 JUSTIN CURRY: OK, so you're saying Curran's self-model 823 00:34:23,420 --> 00:34:26,384 is inherently dependent on his body and his interaction 824 00:34:26,384 --> 00:34:27,300 with his environments. 825 00:34:27,300 --> 00:34:27,860 Right? 826 00:34:27,860 --> 00:34:29,630 So then when I say, you look terrible, 827 00:34:29,630 --> 00:34:33,320 he uses his own internal model and looks at himself 828 00:34:33,320 --> 00:34:35,420 as being terrible. 829 00:34:35,420 --> 00:34:37,679 So then that does make sense. 830 00:34:37,679 --> 00:34:40,254 But then what about the red blood cell question? 831 00:34:40,254 --> 00:34:41,630 AUDIENCE: Red blood cells-- 832 00:34:41,630 --> 00:34:44,330 I mean, usually you do not have any kind of internal model 833 00:34:44,330 --> 00:34:45,636 of your own red blood cells. 834 00:34:45,636 --> 00:34:46,719 JUSTIN CURRY: Yeah, right. 835 00:34:46,719 --> 00:34:48,427 You don't normally have an internal model 836 00:34:48,427 --> 00:34:50,070 of your red blood cells, right? 837 00:34:50,070 --> 00:34:51,650 But suppose you did. 838 00:34:51,650 --> 00:34:53,590 And suppose that-- and this is kind of 839 00:34:53,590 --> 00:34:57,210 going to be one of the take-home 840 00:34:57,210 --> 00:35:01,110 interesting consequences of this thought experiment-- 841 00:35:01,110 --> 00:35:05,750 what if you had internal access to everything that 842 00:35:05,750 --> 00:35:07,255 was going on? 843 00:35:07,255 --> 00:35:08,630 AUDIENCE: That would be terrible. 844 00:35:08,630 --> 00:35:10,070 JUSTIN CURRY: OK, so everyone's going, oh, god, 845 00:35:10,070 --> 00:35:10,946 it would be terrible. 846 00:35:10,946 --> 00:35:12,320 I mean, why would it be terrible? 847 00:35:12,320 --> 00:35:14,303 Like, I'd be trying to walk across this room-- 848 00:35:14,303 --> 00:35:15,677 AUDIENCE: Because some choices we 849 00:35:15,677 --> 00:35:18,889 make are not necessarily the right or appropriate ones. 850 00:35:18,889 --> 00:35:19,930 JUSTIN CURRY: Yes, some-- 851 00:35:19,930 --> 00:35:25,486 AUDIENCE: [INAUDIBLE] red blood cells stop me [INAUDIBLE]. 852 00:35:25,486 --> 00:35:28,850 The first chapter identified what's making the decisions, 853 00:35:28,850 --> 00:35:32,690 that perhaps the conscious thing doesn't make decisions. 854 00:35:32,690 --> 00:35:35,990 But some, the decisions it's aware are being made, 855 00:35:35,990 --> 00:35:39,790 it ascribes to itself. 856 00:35:39,790 --> 00:35:42,340 JUSTIN CURRY: OK, repeat that last line one more time. 857 00:35:42,340 --> 00:35:45,420 AUDIENCE: The decisions that it's aware are being made, 858 00:35:45,420 --> 00:35:48,670 it ascribes as being made by itself. 859 00:35:48,670 --> 00:35:51,550 JUSTIN CURRY: OK, so you make certain decisions 860 00:35:51,550 --> 00:35:54,505 and you think that you're making those decisions. 861 00:35:54,505 --> 00:35:56,630 AUDIENCE: I always ask, who's making the decisions? 862 00:35:56,630 --> 00:35:58,480 My neurons or myself? 863 00:35:58,480 --> 00:35:59,532 JUSTIN CURRY: All right. 864 00:35:59,532 --> 00:36:01,240 And I really wish I could have passed out 865 00:36:01,240 --> 00:36:06,250 the dialogue, like Who Shoves Whom Around Inside the Careenium? 866 00:36:06,250 --> 00:36:09,146 One of Hofstadter's other dialogues. 867 00:36:09,146 --> 00:36:10,330 And you're right. 868 00:36:10,330 --> 00:36:12,314 And I wanted to read this dialogue, 869 00:36:12,314 --> 00:36:14,230 but I don't know if we'll have time for it today.
870 00:36:14,230 --> 00:36:17,470 But the Prelude and Ant Fugue gets at this idea of symbols, 871 00:36:17,470 --> 00:36:20,860 of these kind of larger, emerging groups of neurons 872 00:36:20,860 --> 00:36:23,110 and little internal representations 873 00:36:23,110 --> 00:36:24,700 of the outside world. 874 00:36:24,700 --> 00:36:28,990 And to a certain extent, they make up who we are. 875 00:36:28,990 --> 00:36:33,190 Our internal models are the ones that then collectively make 876 00:36:33,190 --> 00:36:34,570 decisions. 877 00:36:34,570 --> 00:36:37,270 But on the other hand, we then have 878 00:36:37,270 --> 00:36:42,700 times where the emergent thing then 879 00:36:42,700 --> 00:36:45,550 manipulates our internal models of the world. 880 00:36:45,550 --> 00:36:47,590 So suppose you're sitting in this class one day, 881 00:36:47,590 --> 00:36:49,750 and I'm telling you, well, the universe actually 882 00:36:49,750 --> 00:36:50,792 isn't deterministic. 883 00:36:50,792 --> 00:36:52,750 It's actually got this probabilistic framework, 884 00:36:52,750 --> 00:36:53,666 quantum mechanics. 885 00:36:53,666 --> 00:36:55,250 So then you go, wow. 886 00:36:55,250 --> 00:36:58,300 And then you change your base little mental model 887 00:36:58,300 --> 00:37:00,310 of the world. 888 00:37:00,310 --> 00:37:02,890 So I mean, there's this kind of weird interaction 889 00:37:02,890 --> 00:37:04,150 with hierarchies. 890 00:37:04,150 --> 00:37:08,810 So this is an interesting point, because here, there 891 00:37:08,810 --> 00:37:13,300 is very little interplay going down this way. 892 00:37:13,300 --> 00:37:14,050 But you know what? 893 00:37:14,050 --> 00:37:15,820 It does happen, right? 894 00:37:15,820 --> 00:37:19,570 I think one of the best examples is your cell phone. 895 00:37:19,570 --> 00:37:25,120 Because you have this SIM card as part of your cell phone, 896 00:37:25,120 --> 00:37:26,457 which actually-- 897 00:37:26,457 --> 00:37:27,040 yeah, exactly. 898 00:37:27,040 --> 00:37:30,370 I mean, you take it out and it might not be operational. 899 00:37:30,370 --> 00:37:34,900 But you could also try putting it into another cell phone 900 00:37:34,900 --> 00:37:36,762 and it might not work. 901 00:37:36,762 --> 00:37:39,220 And that's, in part, ascribed to this higher service, which 902 00:37:39,220 --> 00:37:41,200 is getting beaten back down. 903 00:37:41,200 --> 00:37:43,300 There is a weird interplay of levels, of software, 904 00:37:43,300 --> 00:37:44,380 and hardware going on in here. 905 00:37:44,380 --> 00:37:46,560 AUDIENCE: So suppose we have like-- the brain really 906 00:37:46,560 --> 00:37:52,132 had perhaps your lower level neurons, and neurons talk 907 00:37:52,132 --> 00:37:54,340 about those neurons that talk about those [? sort. ?] 908 00:37:54,340 --> 00:37:58,850 You're not having this crazy software view. 909 00:37:58,850 --> 00:38:02,280 Maybe the consciousness that's not even the physical, 910 00:38:02,280 --> 00:38:04,320 is actually attracting that [INAUDIBLE], 911 00:38:04,320 --> 00:38:06,069 like making changes to the physical world. 912 00:38:06,069 --> 00:38:08,140 It's like, the physical world is making changes 913 00:38:08,140 --> 00:38:09,340 to the physical world. 914 00:38:09,340 --> 00:38:12,400 But those higher level neurons are sort of the ones 915 00:38:12,400 --> 00:38:14,020 representing the abstractions.
916 00:38:14,020 --> 00:38:15,520 The abstractions don't really exist, 917 00:38:15,520 --> 00:38:18,850 but it's like these neurons are taking something 918 00:38:18,850 --> 00:38:20,440 from the lower level neurons. 919 00:38:20,440 --> 00:38:23,070 And they're seeing the essential processes of them, 920 00:38:23,070 --> 00:38:24,950 and then they're [? controlling ?] them back. 921 00:38:24,950 --> 00:38:28,679 So you are mostly your higher level neurons. 922 00:38:28,679 --> 00:38:30,220 JUSTIN CURRY: All right, so maybe you 923 00:38:30,220 --> 00:38:31,761 are mostly your higher level neurons. 924 00:38:31,761 --> 00:38:34,620 But I think an interesting example-- 925 00:38:34,620 --> 00:38:38,250 and we're getting into details of the brain, which I'm not 926 00:38:38,250 --> 00:38:39,760 sure how it really works. 927 00:38:39,760 --> 00:38:42,480 But for example, with the ant societies, 928 00:38:42,480 --> 00:38:45,110 no ant is higher than another ant. 929 00:38:45,110 --> 00:38:48,256 All ants are pretty much created equal. 930 00:38:48,256 --> 00:38:50,030 Yeah, say again? 931 00:38:50,030 --> 00:38:51,210 Except for the queen. 932 00:38:51,210 --> 00:38:52,590 But still, there is this myth. 933 00:38:52,590 --> 00:38:55,530 There is this myth that the queen ant was sitting there 934 00:38:55,530 --> 00:39:00,570 with her hyper brain, and just doing-- all right, tell forces 935 00:39:00,570 --> 00:39:03,500 A through B to go to sector G and attack, 936 00:39:03,500 --> 00:39:05,520 and then bring over the leaves here. 937 00:39:05,520 --> 00:39:06,020 No. 938 00:39:06,020 --> 00:39:07,620 The queen ant doesn't do that. 939 00:39:07,620 --> 00:39:09,369 I mean, it's just like all the other ants, 940 00:39:09,369 --> 00:39:11,010 and then it's probably got three neurons 941 00:39:11,010 --> 00:39:13,770 and can't do much with them. 942 00:39:13,770 --> 00:39:16,920 But it's the interaction of these very simple things which 943 00:39:16,920 --> 00:39:20,150 can produce complex behavior. 944 00:39:20,150 --> 00:39:21,860 Yes, Atif. 945 00:39:21,860 --> 00:39:23,730 AUDIENCE: I just realized something. 946 00:39:23,730 --> 00:39:26,610 The thing is you're not even aware of your own thinking. 947 00:39:26,610 --> 00:39:31,630 And we often think that we're very self-reflective 948 00:39:31,630 --> 00:39:32,547 even though we're not. 949 00:39:32,547 --> 00:39:33,754 JUSTIN CURRY: All right, yes. 950 00:39:33,754 --> 00:39:35,640 We are-- we do think we're self-reflective. 951 00:39:35,640 --> 00:39:37,265 AUDIENCE: And even reason is beyond us. 952 00:39:37,265 --> 00:39:39,730 I mean, think, how many times have you ever reasoned ever 953 00:39:39,730 --> 00:39:42,914 since you were a [? child? ?] Most of the time, you've felt. 954 00:39:42,914 --> 00:39:45,080 You don't even know where the words are coming from. 955 00:39:45,080 --> 00:39:47,205 You don't know where your concepts are coming from. 956 00:39:47,205 --> 00:39:51,470 So how can we say we are highly self-reflective if we don't 957 00:39:51,470 --> 00:39:54,183 even, most of the time, reflect on our own thoughts, 958 00:39:54,183 --> 00:39:55,474 our own feelings, and whatever? 959 00:39:55,474 --> 00:39:57,330 You know? 960 00:39:57,330 --> 00:39:59,130 JUSTIN CURRY: No, I mean-- 961 00:39:59,130 --> 00:39:59,880 right, exactly. 962 00:39:59,880 --> 00:40:02,490 Like, we tend to just kind of assimilate 963 00:40:02,490 --> 00:40:03,840 data about our outside world.
964 00:40:03,840 --> 00:40:06,540 And especially the mental models passed on to us 965 00:40:06,540 --> 00:40:09,060 from our parents, that immediately start 966 00:40:09,060 --> 00:40:11,850 telling us and training us and imprinting us, 967 00:40:11,850 --> 00:40:14,430 really, like [? geese ?] from our first day 968 00:40:14,430 --> 00:40:16,320 to do certain things. 969 00:40:16,320 --> 00:40:20,201 And we can rarely break out of those loops, 970 00:40:20,201 --> 00:40:21,075 out of those systems. 971 00:40:25,194 --> 00:40:28,530 But that's an interesting idea, right? 972 00:40:28,530 --> 00:40:35,770 It's this idea of, what happens when we can't really scrape 973 00:40:35,770 --> 00:40:37,490 away at the boxes anymore? 974 00:40:37,490 --> 00:40:38,660 Like, we try to-- 975 00:40:38,660 --> 00:40:39,910 I mean, really, what happens-- 976 00:40:39,910 --> 00:40:41,540 I mean, why is psychology so difficult? 977 00:40:41,540 --> 00:40:43,930 Why is solving consciousness so hard? 978 00:40:43,930 --> 00:40:48,040 Because we're kind of trying to take out an eyeball and have it peer 979 00:40:48,040 --> 00:40:48,940 back at us. 980 00:40:48,940 --> 00:40:51,550 And it's like, all right, what am I? 981 00:40:51,550 --> 00:40:54,730 And I remember having this discussion with one of you 982 00:40:54,730 --> 00:40:55,930 after class. 983 00:40:55,930 --> 00:40:57,560 It's this idea that-- 984 00:40:57,560 --> 00:41:01,400 so evolution really designed us to do certain things. 985 00:41:01,400 --> 00:41:06,100 And I suppose one reason why humans took off 986 00:41:06,100 --> 00:41:08,230 with these abnormally large forebrains 987 00:41:08,230 --> 00:41:12,262 was because it was a good method for survival. 988 00:41:12,262 --> 00:41:14,470 We suddenly started getting together in little groups 989 00:41:14,470 --> 00:41:16,720 and planting things in the ground. 990 00:41:16,720 --> 00:41:19,180 And suddenly, we had a really stable source of food. 991 00:41:19,180 --> 00:41:22,842 Or we would trick mammoths and run them off cliffs, 992 00:41:22,842 --> 00:41:25,300 and have them fall on spikes that we designed and sharpened 993 00:41:25,300 --> 00:41:27,220 ourselves. 994 00:41:27,220 --> 00:41:28,810 And suddenly, problem solving was 995 00:41:28,810 --> 00:41:31,630 a really, really important thing for evolution 996 00:41:31,630 --> 00:41:35,140 by natural selection. 997 00:41:35,140 --> 00:41:37,930 People who had these problem solving capabilities 998 00:41:37,930 --> 00:41:40,744 were selected favorably. 999 00:41:40,744 --> 00:41:45,760 But it's really nice to be able to run mammoths off cliffs 1000 00:41:45,760 --> 00:41:48,190 and plant things in dirt. 1001 00:41:48,190 --> 00:41:54,020 But what about the guy who is at the bottom of the pack, 1002 00:41:54,020 --> 00:41:56,390 thinking, what am I? 1003 00:41:56,390 --> 00:41:59,360 And then the cheetah [GROWLS] attacks, 1004 00:41:59,360 --> 00:42:02,681 as he's caught in these thought loops about, what am I? 1005 00:42:02,681 --> 00:42:04,565 AUDIENCE: [INAUDIBLE] because it's always 1006 00:42:04,565 --> 00:42:06,217 like a crazy philosopher. 1007 00:42:06,217 --> 00:42:08,300 JUSTIN CURRY: Right, I mean, the philosopher was-- 1008 00:42:08,300 --> 00:42:12,200 AUDIENCE: I mean, you first get problem solving, 1009 00:42:12,200 --> 00:42:15,200 average, maybe perhaps above average a little bit.
1010 00:42:15,200 --> 00:42:19,210 And then you get these monsters like [INAUDIBLE] 1011 00:42:19,210 --> 00:42:20,580 that come along [INAUDIBLE]. 1012 00:42:20,580 --> 00:42:22,300 Or something like all the way over there. 1013 00:42:22,300 --> 00:42:24,390 We jump way too fast. 1014 00:42:24,390 --> 00:42:28,130 We don't have this infrastructure to keep ourselves 1015 00:42:28,130 --> 00:42:29,570 from all of these enemies. 1016 00:42:29,570 --> 00:42:32,044 You've got to slow down a little bit. 1017 00:42:32,044 --> 00:42:33,960 JUSTIN CURRY: I mean, I think that's exactly-- 1018 00:42:33,960 --> 00:42:36,740 you do have these kinds of figures which just seem-- 1019 00:42:36,740 --> 00:42:39,440 I mean, Archimedes for example. 1020 00:42:39,440 --> 00:42:42,650 If you want to talk about a guy who-- 1021 00:42:42,650 --> 00:42:46,340 the way one of my lecturers at Cambridge used to put him, 1022 00:42:46,340 --> 00:42:49,370 Dr. Piers Bursill-Hall, is he would describe these people 1023 00:42:49,370 --> 00:42:52,010 as nine dimensional, hyper-intelligent beings. 1024 00:42:52,010 --> 00:42:53,270 And their bodies were just-- 1025 00:42:53,270 --> 00:42:55,670 they're a four dimensional protrusion into our world. 1026 00:42:55,670 --> 00:42:59,899 So I'm like, that's because they were practically aliens 1027 00:42:59,899 --> 00:43:01,190 in terms of their intelligence. 1028 00:43:01,190 --> 00:43:04,110 And we just had no idea what they were doing. 1029 00:43:04,110 --> 00:43:05,930 And that's almost dangerous, right? 1030 00:43:05,930 --> 00:43:11,030 I mean, we tend not to like or really 1031 00:43:11,030 --> 00:43:15,730 go favorably with people who seem really out there. 1032 00:43:15,730 --> 00:43:21,020 And we kind of have a sad history of persecution. 1033 00:43:21,020 --> 00:43:26,095 Humans aren't always very nice to the above average. 1034 00:43:29,210 --> 00:43:32,602 I was bringing up this idea of having intelligence and having 1035 00:43:32,602 --> 00:43:34,060 these problem solving capabilities, 1036 00:43:34,060 --> 00:43:36,820 but that this ability to look back at ourselves 1037 00:43:36,820 --> 00:43:41,080 and be reflective was not necessarily very good for evolution. 1038 00:43:41,080 --> 00:43:45,270 And I've actually heard it suggested that our brains 1039 00:43:45,270 --> 00:43:48,000 actually evolved not to really solve 1040 00:43:48,000 --> 00:43:50,190 the problem of consciousness, because we're 1041 00:43:50,190 --> 00:43:52,850 spending too much time-- 1042 00:43:52,850 --> 00:43:55,420 well, because it wasn't favorable in terms of evolution 1043 00:43:55,420 --> 00:43:58,390 to actually be able to consider deeply and understand 1044 00:43:58,390 --> 00:44:00,580 what's going on internally, having an internal model 1045 00:44:00,580 --> 00:44:02,260 of all your thoughts. 1046 00:44:02,260 --> 00:44:05,080 We don't have a transcript of, why did I make that decision. 1047 00:44:05,080 --> 00:44:07,240 Like, well, in sector A, this happened 1048 00:44:07,240 --> 00:44:09,180 and this neuron fired on that and this one. 1049 00:44:09,180 --> 00:44:11,470 And that is this, right? 1050 00:44:11,470 --> 00:44:13,110 We don't have access to that. 1051 00:44:13,110 --> 00:44:14,862 That's, in part, because evolution said, 1052 00:44:14,862 --> 00:44:16,570 we don't want to give you access to that.
1053 00:44:16,570 --> 00:44:18,520 We're going to put this together in a nice little rat brain, 1054 00:44:18,520 --> 00:44:20,770 and you're not going to have access to those thoughts. 1055 00:44:20,770 --> 00:44:23,470 You're just going to have their consequences. 1056 00:44:23,470 --> 00:44:25,064 I mean, it's a troubling idea. 1057 00:44:25,064 --> 00:44:26,730 Sandra, you were going to say something. 1058 00:44:26,730 --> 00:44:30,618 AUDIENCE: Is there any advancement in human evolvement 1059 00:44:30,618 --> 00:44:33,272 besides logical thinking [INAUDIBLE]? 1060 00:44:33,272 --> 00:44:35,730 JUSTIN CURRY: Is there any way to advance logical thinking, 1061 00:44:35,730 --> 00:44:37,020 to essentially advance humans? 1062 00:44:37,020 --> 00:44:39,270 AUDIENCE: Human advancement through thinking. 1063 00:44:39,270 --> 00:44:41,490 Maybe there's [INAUDIBLE] better. 1064 00:44:41,490 --> 00:44:47,000 I mean, we did evolve from [INAUDIBLE] simple [INAUDIBLE]. 1065 00:44:47,000 --> 00:44:47,970 JUSTIN CURRY: Right. 1066 00:44:47,970 --> 00:44:52,300 To an extent, we are right now, in this classroom. 1067 00:44:52,300 --> 00:44:56,070 And the fact that most of us, including myself, 1068 00:44:56,070 --> 00:44:56,910 we have bad vision. 1069 00:44:56,910 --> 00:44:59,490 Like, if I were really stuck out in the field, 1070 00:44:59,490 --> 00:45:03,180 I wouldn't have a chance to sit and look over books. 1071 00:45:03,180 --> 00:45:08,211 I'd be futilely hunting and not succeeding, and dying. 1072 00:45:08,211 --> 00:45:10,710 But our society has reached a level of technology and health 1073 00:45:10,710 --> 00:45:14,280 care, where people can go on and we 1074 00:45:14,280 --> 00:45:16,714 don't have to worry about base survival needs. 1075 00:45:16,714 --> 00:45:18,880 And I don't know how many of you ever heard of this, 1076 00:45:18,880 --> 00:45:22,010 but Maslow's Hierarchy of Needs. 1077 00:45:22,010 --> 00:45:24,891 Anyone heard of Maslow? 1078 00:45:24,891 --> 00:45:26,890 Have any of you guys done debate in high school? 1079 00:45:26,890 --> 00:45:30,450 Do you even have debate teams or classes? 1080 00:45:30,450 --> 00:45:32,700 They don't have these programs here? 1081 00:45:32,700 --> 00:45:33,840 Gasp! 1082 00:45:33,840 --> 00:45:36,330 All right. 1083 00:45:36,330 --> 00:45:43,180 So Maslow said we have this hierarchy of needs 1084 00:45:43,180 --> 00:45:44,850 before we can do anything. 1085 00:45:44,850 --> 00:45:57,130 And at the top here, he had essentially transcendence. 1086 00:45:57,130 --> 00:46:00,040 I'm actually blanking on the exact word which he used. 1087 00:46:00,040 --> 00:46:01,370 This is embarrassing. 1088 00:46:01,370 --> 00:46:05,320 But either way, I want you to Google this. 1089 00:46:05,320 --> 00:46:09,000 Abraham Maslow. 1090 00:46:09,000 --> 00:46:10,750 We have Hierarchy of Needs. 1091 00:46:15,420 --> 00:46:17,720 I can't spell either. 1092 00:46:17,720 --> 00:46:19,010 Is there an "a" here? 1093 00:46:19,010 --> 00:46:20,865 No. 1094 00:46:20,865 --> 00:46:24,290 It's just R-- H-I-E-R, Hierarchy of Needs. 1095 00:46:24,290 --> 00:46:26,460 Thank you. 1096 00:46:26,460 --> 00:46:28,274 And we have this-- 1097 00:46:28,274 --> 00:46:29,065 self-actualization! 1098 00:46:29,065 --> 00:46:29,565 Ha-ha! 1099 00:46:29,565 --> 00:46:30,520 Ha-ha! 1100 00:46:30,520 --> 00:46:31,209 Here we go. 1101 00:46:40,320 --> 00:46:46,410 And we have food, water, shelter.
1102 00:46:46,410 --> 00:46:48,770 And this is really where humanity spent 1103 00:46:48,770 --> 00:46:52,670 most of our early development, just staying at this level. 1104 00:46:52,670 --> 00:46:54,934 CURRAN KELLEHER: Here it is. 1105 00:46:54,934 --> 00:46:56,100 JUSTIN CURRY: Ah, fantastic. 1106 00:46:59,310 --> 00:47:03,580 And he roughly describes this as physiological needs. 1107 00:47:08,200 --> 00:47:11,360 And then we have safety. 1108 00:47:11,360 --> 00:47:15,510 So we have security of body. 1109 00:47:15,510 --> 00:47:19,830 I'm just going to go ahead and write safety in here. 1110 00:47:19,830 --> 00:47:21,291 Pull that out there. 1111 00:47:21,291 --> 00:47:21,790 Safety. 1112 00:47:24,750 --> 00:47:25,880 Love, belonging. 1113 00:47:34,510 --> 00:47:36,390 Esteem. 1114 00:47:36,390 --> 00:47:38,550 We have to at least think somewhat 1115 00:47:38,550 --> 00:47:41,540 positively of ourselves. 1116 00:47:41,540 --> 00:47:43,460 And then we get to self-actualization, 1117 00:47:43,460 --> 00:47:47,900 which Wikipedia describes as morality, creativity, 1118 00:47:47,900 --> 00:47:50,870 spontaneity, problem solving, lack of prejudice, 1119 00:47:50,870 --> 00:47:53,780 acceptance of facts. 1120 00:47:53,780 --> 00:47:58,130 And really the argument-- and a lot of people use this 1121 00:47:58,130 --> 00:48:00,380 in a scary, slippery-slope way-- 1122 00:48:00,380 --> 00:48:03,590 is to say, well, look, we're going to protect you by constant 24 hour 1123 00:48:03,590 --> 00:48:04,490 surveillance. 1124 00:48:04,490 --> 00:48:07,010 Because before you guys can even think 1125 00:48:07,010 --> 00:48:10,220 of doing anything up here, you just 1126 00:48:10,220 --> 00:48:11,780 need to be safe in your homes. 1127 00:48:11,780 --> 00:48:14,630 You need to be safe from terrorists. 1128 00:48:14,630 --> 00:48:19,640 So before you can even do any of this, you've got to have this. 1129 00:48:19,640 --> 00:48:22,940 Which is funny, because we tend to do things which 1130 00:48:22,940 --> 00:48:25,784 would infringe on inherent properties 1131 00:48:25,784 --> 00:48:27,700 of self-actualization, like freedom of speech, 1132 00:48:27,700 --> 00:48:29,408 freedom of thought, et cetera, et cetera. 1133 00:48:34,100 --> 00:48:37,320 You can, in some ways, say that humans have been 1134 00:48:37,320 --> 00:48:38,670 trying to climb up this ladder. 1135 00:48:38,670 --> 00:48:42,450 And that even us as individuals in our lives, 1136 00:48:42,450 --> 00:48:45,870 our parents provided us with food and shelter, 1137 00:48:45,870 --> 00:48:48,270 and security and safety. 1138 00:48:48,270 --> 00:48:50,590 And good parents should give us a feeling 1139 00:48:50,590 --> 00:48:52,350 of love and belonging. 1140 00:48:52,350 --> 00:48:56,430 And this then enables us to do things like go to college 1141 00:48:56,430 --> 00:48:59,921 and think about Immanuel Kant and other philosophers, 1142 00:48:59,921 --> 00:49:00,420 and stuff. 1143 00:49:00,420 --> 00:49:01,320 Yes, Atif. 1144 00:49:01,320 --> 00:49:03,310 AUDIENCE: Couldn't, like, being 1145 00:49:03,310 --> 00:49:06,890 under 24 hour surveillance sort of take away 1146 00:49:06,890 --> 00:49:09,000 our feeling of safety? 1147 00:49:09,000 --> 00:49:11,910 JUSTIN CURRY: Right, so the argument goes both ways. 1148 00:49:11,910 --> 00:49:16,020 We could also not feel safe by being surveilled. 1149 00:49:16,020 --> 00:49:19,860 But this is a debate for policy makers, 1150 00:49:19,860 --> 00:49:23,302 not necessarily for me right now.
1151 00:49:23,302 --> 00:49:24,010 But you're right. 1152 00:49:24,010 --> 00:49:25,634 I mean, Sandra, I'm glad you brought up 1153 00:49:25,634 --> 00:49:28,105 this point of this idea of humans trying to advance. 1154 00:49:30,610 --> 00:49:33,840 And then it's kind of sad to say, 1155 00:49:33,840 --> 00:49:35,790 but most of us here in this classroom 1156 00:49:35,790 --> 00:49:39,000 are really occupied with-- 1157 00:49:39,000 --> 00:49:42,200 we're all up here in this triangle. 1158 00:49:42,200 --> 00:49:43,950 And we're worried, well, what about logic? 1159 00:49:43,950 --> 00:49:45,660 Well, is computer science going to lead us to-- 1160 00:49:45,660 --> 00:49:48,000 and we're worried about this very tippy peak, which 1161 00:49:48,000 --> 00:49:50,232 is enlightenment. 1162 00:49:50,232 --> 00:49:52,440 Whereas there are plenty of people out in the streets 1163 00:49:52,440 --> 00:49:55,890 today who are worrying about these things; 1164 00:49:55,890 --> 00:49:59,202 they're not worried about, is Godel's Incompleteness 1165 00:49:59,202 --> 00:50:01,160 Theorem going to give me my final understanding 1166 00:50:01,160 --> 00:50:03,440 of consciousness in the brain. 1167 00:50:03,440 --> 00:50:05,480 I mean, just think about how far of a level 1168 00:50:05,480 --> 00:50:08,300 that is, and even think of how far of a level 1169 00:50:08,300 --> 00:50:14,150 we have historically come as people in escaping 1170 00:50:14,150 --> 00:50:19,770 the brutish and harsh natural selection on this level. 1171 00:50:19,770 --> 00:50:23,700 And we've now been engaging in a selection on this level. 1172 00:50:23,700 --> 00:50:26,400 And as you guys watch Waking Life next lecture, 1173 00:50:26,400 --> 00:50:28,430 you'll actually get a really pumped up 1174 00:50:28,430 --> 00:50:30,830 philosopher who will talk, in this film, 1175 00:50:30,830 --> 00:50:35,240 about this idea of a neoevolution and the neohuman. 1176 00:50:35,240 --> 00:50:38,060 What happens when evolution starts 1177 00:50:38,060 --> 00:50:41,280 no longer acting in terms of genes and security and safety, 1178 00:50:41,280 --> 00:50:41,780 here? 1179 00:50:41,780 --> 00:50:43,700 Because, we as humans, we don't usually 1180 00:50:43,700 --> 00:50:45,140 worry about this anymore. 1181 00:50:45,140 --> 00:50:48,000 Like, now evolution is happening on a cultural level, 1182 00:50:48,000 --> 00:50:53,030 on a memetic level, for those of you who know what memes are. 1183 00:50:53,030 --> 00:50:57,920 But we're going tangential right now, 1184 00:50:57,920 --> 00:51:00,490 and I want to pass things over to Curran, 1185 00:51:00,490 --> 00:51:04,760 who's got lots of goodies for you, once again. 1186 00:51:04,760 --> 00:51:09,650 And go through this idea of climbing up 1187 00:51:09,650 --> 00:51:16,280 pyramids, and this interplay of levels and descriptions. 1188 00:51:16,280 --> 00:51:18,790 As such, you guys get your five minute break 1189 00:51:18,790 --> 00:51:20,450 while we reorganize things. 1190 00:51:20,450 --> 00:51:23,197 And see you all back here. 1191 00:51:23,197 --> 00:51:24,822 AUDIENCE: So what are some consequences 1192 00:51:24,822 --> 00:51:27,282 if those [INAUDIBLE]? 1193 00:51:27,282 --> 00:51:28,452 Unbalanced? 1194 00:51:28,452 --> 00:51:30,910 JUSTIN CURRY: So what are some consequences if those things 1195 00:51:30,910 --> 00:51:32,560 aren't balanced, right? 1196 00:51:32,560 --> 00:51:36,410 So what happens if you suddenly don't feel safe?
1197 00:51:36,410 --> 00:51:40,390 OK, well, I think one clear consequence, based 1198 00:51:40,390 --> 00:51:41,860 at least on this model, is the idea 1199 00:51:41,860 --> 00:51:44,470 that if we don't feel safe, we're 1200 00:51:44,470 --> 00:51:48,754 not going to want to pursue anything higher. 1201 00:51:48,754 --> 00:51:50,170 We're definitely not going to feel 1202 00:51:50,170 --> 00:51:52,086 loved if we don't feel safe. 1203 00:51:52,086 --> 00:51:53,710 And we're not going to feel-- we're not 1204 00:51:53,710 --> 00:51:55,251 going to have any kind of self-esteem 1205 00:51:55,251 --> 00:51:56,771 if we don't feel loved. 1206 00:51:56,771 --> 00:51:58,854 AUDIENCE: How about [? Hitler? ?] I don't think he 1207 00:51:58,854 --> 00:52:00,790 felt [? safe ?] when he was doing it. 1208 00:52:00,790 --> 00:52:01,840 JUSTIN CURRY: OK. 1209 00:52:01,840 --> 00:52:03,160 So you're right. 1210 00:52:03,160 --> 00:52:07,570 There are these few kind of crazy characters in history 1211 00:52:07,570 --> 00:52:12,370 who have been ultra paranoid, not felt loved, and just 1212 00:52:12,370 --> 00:52:14,920 been essentially schizophrenic, right? 1213 00:52:14,920 --> 00:52:16,120 Look at Isaac Newton. 1214 00:52:16,120 --> 00:52:18,620 If you want to talk about a not nice man, 1215 00:52:18,620 --> 00:52:20,620 he was a very, very evil man. 1216 00:52:20,620 --> 00:52:23,700 In fact, in his, I think, 32 years as 1217 00:52:23,700 --> 00:52:28,720 head of the Mint, where he was responsible for giving people 1218 00:52:28,720 --> 00:52:30,550 their lives back-- because in those days, 1219 00:52:30,550 --> 00:52:33,970 clipping coins was punishable by hanging. 1220 00:52:33,970 --> 00:52:36,617 And he would never, never clear anyone's sentence. 1221 00:52:36,617 --> 00:52:38,950 I mean, everybody who went up for coin clipping got hung 1222 00:52:38,950 --> 00:52:39,780 underneath Newton's-- 1223 00:52:39,780 --> 00:52:40,904 AUDIENCE: What is clipping? 1224 00:52:40,904 --> 00:52:42,490 JUSTIN CURRY: So back in the day, 1225 00:52:42,490 --> 00:52:45,310 coins were actually silver because the metal 1226 00:52:45,310 --> 00:52:46,740 itself had inherent value. 1227 00:52:46,740 --> 00:52:48,115 So what people used to do is they 1228 00:52:48,115 --> 00:52:50,860 would clip little chips of silver off a coin every time 1229 00:52:50,860 --> 00:52:52,360 one came into their possession, 1230 00:52:52,360 --> 00:52:53,890 and they'd have a pile of silver. 1231 00:52:53,890 --> 00:52:55,810 And you could turn it into another coin. 1232 00:52:55,810 --> 00:52:59,530 So you just had another coin spring from nothing. 1233 00:52:59,530 --> 00:53:02,410 Of course, you can imagine, you would 1234 00:53:02,410 --> 00:53:04,660 have to write letters to the warden of the Mint, 1235 00:53:04,660 --> 00:53:08,920 saying things like, oh, I'm so sorry Mr. Newton, sir. 1236 00:53:08,920 --> 00:53:10,930 I clipped only two coins and it was 1237 00:53:10,930 --> 00:53:13,760 so I could buy an extra loaf of bread for my family. 1238 00:53:13,760 --> 00:53:16,970 The Mrs. has another bun in the oven. 1239 00:53:16,970 --> 00:53:19,430 I've got three kids I need to feed. 1240 00:53:19,430 --> 00:53:21,640 And this guy is probably illiterate too 1241 00:53:21,640 --> 00:53:25,000 and struggling with writing this letter, right? 1242 00:53:25,000 --> 00:53:28,510 And Newton's like, [LAUGHING] hang him. 1243 00:53:28,510 --> 00:53:30,987 So he wasn't a very nice person, but he was brilliant.
1244 00:53:30,987 --> 00:53:33,070 AUDIENCE: He was very [? extra ?] [? apathetic. ?] 1245 00:53:33,070 --> 00:53:35,980 JUSTIN CURRY: And yeah, he kind of went higher up this pyramid 1246 00:53:35,980 --> 00:53:39,040 than any of us can really hope to. 1247 00:53:39,040 --> 00:53:43,440 So yeah, there are some defects in this model. 1248 00:53:43,440 --> 00:53:45,980 AUDIENCE: [INAUDIBLE] didn't you actually 1249 00:53:45,980 --> 00:53:48,199 say morality was one of them? 1250 00:53:48,199 --> 00:53:48,990 JUSTIN CURRY: Yeah. 1251 00:53:48,990 --> 00:53:50,927 AUDIENCE: So you can sort of get rid of it? 1252 00:53:50,927 --> 00:53:53,010 JUSTIN CURRY: [INAUDIBLE] you don't need morality. 1253 00:53:53,010 --> 00:53:54,720 But I mean, ideally, we'd have this kind of-- now, 1254 00:53:54,720 --> 00:53:57,360 you have to also remember Maslow was a philosopher of the '60s 1255 00:53:57,360 --> 00:54:00,240 and '70s, and believed that enlightened people would be 1256 00:54:00,240 --> 00:54:01,920 loving and happy, all the time. 1257 00:54:04,860 --> 00:54:05,840 You want this, Curran? 1258 00:54:05,840 --> 00:54:08,780 CURRAN KELLEHER: Yeah. 1259 00:54:08,780 --> 00:54:13,480 AUDIENCE: But can you actually say Newton was enlightened? 1260 00:54:13,480 --> 00:54:16,540 JUSTIN CURRY: No, I don't think you can. 1261 00:54:16,540 --> 00:54:22,280 But if you want to talk about the qualities of intelligence 1262 00:54:22,280 --> 00:54:25,190 and problem solving-- 1263 00:54:25,190 --> 00:54:26,690 even when he was old and people were 1264 00:54:26,690 --> 00:54:31,280 trying to solve the problem of the brachistochrone, 1265 00:54:31,280 --> 00:54:32,600 the shortest-time curve-- 1266 00:54:32,600 --> 00:54:35,890 what shape slide would enable a ball 1267 00:54:35,890 --> 00:54:38,240 dropped down it to go from point A 1268 00:54:38,240 --> 00:54:40,760 to point B in the least amount of time? 1269 00:54:40,760 --> 00:54:43,430 Like, this was being circulated through European and British 1270 00:54:43,430 --> 00:54:46,480 mathematical journals for years. 1271 00:54:46,480 --> 00:54:48,540 Nobody really came up with a good solution. 1272 00:54:48,540 --> 00:54:51,262 So somebody finally passed it off to Newton. 1273 00:54:51,262 --> 00:54:54,280 He comes in at like 65, and he just kind of goes, 1274 00:54:54,280 --> 00:54:59,270 [SCOFFING] don't insult me with these easy problems. 1275 00:54:59,270 --> 00:55:03,870 Stayed up that night and solved it in a couple of hours. 1276 00:55:03,870 --> 00:55:05,870 I mean, that's just the kind of guy that he was. 1277 00:55:10,770 --> 00:55:13,709 AUDIENCE: So would you say that he let anyone off? 1278 00:55:13,709 --> 00:55:15,000 JUSTIN CURRY: I don't think so. 1279 00:55:15,000 --> 00:55:17,150 I don't think he ever let anyone off in his 32 1280 00:55:17,150 --> 00:55:18,704 years of being head of the Mint. 1281 00:55:24,560 --> 00:55:26,600 So Curran, do you want to talk about this? 1282 00:55:26,600 --> 00:55:28,600 CURRAN KELLEHER: You can start talking about it. 1283 00:55:28,600 --> 00:55:29,308 JUSTIN CURRY: OK. 1284 00:55:35,250 --> 00:55:38,490 All right, so this is actually-- 1285 00:55:38,490 --> 00:55:42,322 this gives you an idea of how Curran and I plan for lectures. 1286 00:55:42,322 --> 00:55:44,780 Actually, this is the only time we've ever used a whiteboard. 1287 00:55:47,400 --> 00:55:50,430 So we had this idea of-- 1288 00:55:54,764 --> 00:55:56,680 I wrote this question on the board for Curran.
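[For the curious, the brachistochrone problem Justin describes above has a classical closed-form answer; this is a standard textbook result, not something worked on the board in lecture. The fastest slide is an arc of a cycloid, the curve traced by a point on the rim of a rolling circle.]

```latex
% Time for a bead released at rest at A to slide along y(x) under gravity g
% (y measured downward from A):
\[
  T[y] \;=\; \int_{0}^{x_B} \sqrt{\frac{1 + y'(x)^2}{2\,g\,y(x)}}\; dx
\]
% The curve minimizing T (via the Euler-Lagrange equation) is a cycloid:
\[
  x(\theta) = a(\theta - \sin\theta), \qquad y(\theta) = a(1 - \cos\theta),
\]
% with the one free constant a fixed by making the curve pass through B.
```

[The difficulty that made the problem famous is that T is a functional, a "function of a curve," so minimizing it needs the calculus of variations rather than ordinary calculus; that is what Newton reportedly dispatched overnight.]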
1289 00:55:56,680 --> 00:56:00,610 I said, if he had to administer a test to decide whether 1290 00:56:00,610 --> 00:56:03,130 or not-- and it kind of runs off the edge-- 1291 00:56:03,130 --> 00:56:05,140 someone was destined to be a computer scientist, 1292 00:56:05,140 --> 00:56:05,848 what would it be? 1293 00:56:05,848 --> 00:56:07,577 AUDIENCE: [INAUDIBLE] computer. 1294 00:56:07,577 --> 00:56:09,160 JUSTIN CURRY: It had to be a computer. 1295 00:56:09,160 --> 00:56:14,920 [LAUGHING] Yeah, I guess it got clipped somehow. 1296 00:56:14,920 --> 00:56:19,620 But either way, just by forcing you guys 1297 00:56:19,620 --> 00:56:24,420 to vote by those boxes, I went ahead and had this-- 1298 00:56:24,420 --> 00:56:26,700 I was trying to get at, what levels are 1299 00:56:26,700 --> 00:56:30,690 you guys most interested in? 1300 00:56:30,690 --> 00:56:34,610 And I talked about if you're a physicist, 1301 00:56:34,610 --> 00:56:40,610 and you're sitting in high school biology, 1302 00:56:40,610 --> 00:56:44,937 someone might tell you about organs and things like this. 1303 00:56:44,937 --> 00:56:46,520 And if you were a physicist, you would 1304 00:56:46,520 --> 00:56:49,400 be like, OK, great, whatever. 1305 00:56:49,400 --> 00:56:50,200 Tell me more. 1306 00:56:50,200 --> 00:56:51,158 And they'd be like, OK. 1307 00:56:51,158 --> 00:56:55,040 Well, these organisms are made out of tissues and cells. 1308 00:56:55,040 --> 00:56:57,410 And the physicist would say, ah, fine. 1309 00:56:57,410 --> 00:56:59,210 Yeah, what else? 1310 00:56:59,210 --> 00:57:01,920 Well, those are made out of these big proteins 1311 00:57:01,920 --> 00:57:03,798 and these organic molecules. 1312 00:57:03,798 --> 00:57:07,310 The physicist would be like, OK. 1313 00:57:07,310 --> 00:57:10,040 And those are made of chemicals, which are then made of atoms. 1314 00:57:10,040 --> 00:57:11,665 And then they'd be like, ah, now you're 1315 00:57:11,665 --> 00:57:13,280 starting to make me interested. 1316 00:57:13,280 --> 00:57:15,980 And so then a physicist really occupies themselves 1317 00:57:15,980 --> 00:57:19,740 with pushing downwards on this level 1318 00:57:19,740 --> 00:57:22,975 as far as possible-- particles, forces, quarks, strings. 1319 00:57:25,466 --> 00:57:26,840 So you can almost characterize it 1320 00:57:26,840 --> 00:57:29,125 as a process of reductionism. 1321 00:57:29,125 --> 00:57:30,500 And this is even true when you're 1322 00:57:30,500 --> 00:57:34,290 thinking about large things like supernovae, galaxies. 1323 00:57:34,290 --> 00:57:36,080 You're trying to reduce these large scale 1324 00:57:36,080 --> 00:57:38,270 phenomena in terms of explaining them 1325 00:57:38,270 --> 00:57:39,770 in fundamental forces-- 1326 00:57:39,770 --> 00:57:43,710 F equals ma, things like this. 1327 00:57:43,710 --> 00:57:46,086 But then what about-- 1328 00:57:46,086 --> 00:57:48,170 I forget what the statistic is, but I want to say 1329 00:57:48,170 --> 00:57:54,680 something like 50% plus of all people who-- 1330 00:57:54,680 --> 00:57:56,570 for undergraduate majors, do humanities, 1331 00:57:56,570 --> 00:57:59,930 things like law, history, creative writing, 1332 00:57:59,930 --> 00:58:02,990 things like this. 1333 00:58:02,990 --> 00:58:05,360 If you're one of these people, you don't ever even 1334 00:58:05,360 --> 00:58:08,150 look below this line. 1335 00:58:08,150 --> 00:58:11,664 You're really caring about what happens here, right?
1336 00:58:11,664 --> 00:58:13,580 What happens when you take your basic building 1337 00:58:13,580 --> 00:58:18,470 blocks of, say, fear, hunger, desire, and selfishness, 1338 00:58:18,470 --> 00:58:21,050 and you take linear combinations of them? 1339 00:58:21,050 --> 00:58:25,400 And you're like, well, I'm going to give like 75% fear 1340 00:58:25,400 --> 00:58:27,110 with a little bit of desire. 1341 00:58:27,110 --> 00:58:30,140 And that's going to be a new emotion called apprehension. 1342 00:58:30,140 --> 00:58:32,460 And then as you kind of build up here, 1343 00:58:32,460 --> 00:58:34,370 you start going with levels of description 1344 00:58:34,370 --> 00:58:36,620 starting on the level of emotions, 1345 00:58:36,620 --> 00:58:38,755 and then building up from there. 1346 00:58:38,755 --> 00:58:40,130 Especially if you're a literature 1347 00:58:40,130 --> 00:58:42,088 major, interested in what your language can do. 1348 00:58:44,900 --> 00:58:48,650 So then we keep asking, so like, what's an engineer? 1349 00:58:48,650 --> 00:58:51,670 I mean, who here thinks they like engineering? 1350 00:58:51,670 --> 00:58:52,190 OK, great. 1351 00:58:52,190 --> 00:58:58,430 So you guys see, or you might see, everyday materials. 1352 00:59:00,675 --> 00:59:02,550 You're kind of like your own little MacGyver. 1353 00:59:02,550 --> 00:59:06,230 Like, give me a toothpick, a rubber band, 1354 00:59:06,230 --> 00:59:12,180 and a piece of dough, and I can pick somebody's lock. 1355 00:59:12,180 --> 00:59:15,570 I mean, that's just what MacGyver does. 1356 00:59:15,570 --> 00:59:18,270 And you're always-- or I hate to generalize, 1357 00:59:18,270 --> 00:59:20,950 but you're interested in how you can 1358 00:59:20,950 --> 00:59:24,910 take basic, ready, efficient materials around you 1359 00:59:24,910 --> 00:59:28,191 and then create practical, effective solutions. 1360 00:59:28,191 --> 00:59:30,190 And you're always kind of working in this domain 1361 00:59:30,190 --> 00:59:32,560 here, maybe even going from small projects 1362 00:59:32,560 --> 00:59:33,510 to city wide ones. 1363 00:59:37,930 --> 00:59:42,130 So then still we ask, so what makes a computer scientist? 1364 00:59:42,130 --> 00:59:44,130 And I say that really, fundamentally, a computer 1365 00:59:44,130 --> 00:59:47,290 scientist is a lot like a mathematician, who 1366 00:59:47,290 --> 00:59:50,012 doesn't really explore the physical world 1367 00:59:50,012 --> 00:59:51,970 and its layers of abstraction, but instead lives 1368 00:59:51,970 --> 00:59:54,550 in kind of a Platonic world. 1369 00:59:54,550 --> 00:59:57,550 And their atoms are sequences and series and logics 1370 00:59:57,550 --> 01:00:02,440 and grammars and calculus and algebra and geometry. 1371 01:00:02,440 --> 01:00:05,470 And you get to play around with these things. 1372 01:00:08,620 --> 01:00:11,772 But I would say that inherently what 1373 01:00:11,772 --> 01:00:13,980 makes a mathematician a computer scientist is looking 1374 01:00:13,980 --> 01:00:19,650 at just patterns, and kind of abstracting away any details. 1375 01:00:19,650 --> 01:00:22,050 I was really sad when I stopped really 1376 01:00:22,050 --> 01:00:24,769 being interested in biology, because I used to love biology. 1377 01:00:24,769 --> 01:00:26,060 I'm like, oh, isn't this great? 1378 01:00:26,060 --> 01:00:27,630 Look what evolution produces.
1379 01:00:27,630 --> 01:00:30,160 But really, once I learned about the theory of evolution, 1380 01:00:30,160 --> 01:00:34,580 I realized, well, that pretty much explains it. 1381 01:00:34,580 --> 01:00:38,080 You have a couple of rolls of the dice: 1382 01:00:38,080 --> 01:00:44,330 you have genetic mutation, phenotypes, selection, repeat. 1383 01:00:44,330 --> 01:00:45,200 Right? 1384 01:00:45,200 --> 01:00:47,660 Once the theory was there in front of me-- 1385 01:00:47,660 --> 01:00:50,300 well, I'm glad Darwin solved that problem. 1386 01:00:50,300 --> 01:00:51,794 I don't have to worry about it. 1387 01:00:51,794 --> 01:00:53,960 And suddenly, biology became less interesting to me, 1388 01:00:53,960 --> 01:00:57,585 just because what, I guess, ultimately boils-- 1389 01:00:57,585 --> 01:00:58,460 AUDIENCE: [INAUDIBLE] 1390 01:00:58,460 --> 01:00:59,793 JUSTIN CURRY: Say again, Sandra? 1391 01:00:59,793 --> 01:01:01,267 AUDIENCE: [INAUDIBLE]. 1392 01:01:01,267 --> 01:01:02,100 JUSTIN CURRY: Right. 1393 01:01:02,100 --> 01:01:03,660 I mean, the theory was there. 1394 01:01:03,660 --> 01:01:06,660 And that kind of, I guess, puts me at this theorist 1395 01:01:06,660 --> 01:01:10,950 and philosopher level-- I don't care about the details. 1396 01:01:10,950 --> 01:01:12,690 And people spend their entire lives 1397 01:01:12,690 --> 01:01:15,960 like studying the action of this protein on thing x 1398 01:01:15,960 --> 01:01:18,420 and y, right? 1399 01:01:18,420 --> 01:01:20,520 But I don't really care about the detail, just 1400 01:01:20,520 --> 01:01:23,900 the conceptual process. 1401 01:01:23,900 --> 01:01:26,600 But I don't want to bias any of you guys. 1402 01:01:26,600 --> 01:01:29,930 I mean, everybody has their own take on things. 1403 01:01:29,930 --> 01:01:33,530 And even Curran and I had differing opinions 1404 01:01:33,530 --> 01:01:38,850 about what was interesting and what you guys find interesting. 1405 01:01:38,850 --> 01:01:40,604 So I want to turn things over to him. 1406 01:01:40,604 --> 01:01:42,020 CURRAN KELLEHER: So something that 1407 01:01:42,020 --> 01:01:43,970 was really cool, that we noticed about these, 1408 01:01:43,970 --> 01:01:49,010 is the universality of things being on lower levels, 1409 01:01:49,010 --> 01:01:52,160 and things emerging out of those lower levels 1410 01:01:52,160 --> 01:01:54,140 into higher levels of things. 1411 01:01:54,140 --> 01:01:57,830 Like electrical engineering-- like 1412 01:01:57,830 --> 01:01:59,150 he said, transistors and stuff. 1413 01:01:59,150 --> 01:02:01,700 You have all these different layers 1414 01:02:01,700 --> 01:02:04,920 of things that build up to software; with neurons, 1415 01:02:04,920 --> 01:02:06,440 and the brain, and thoughts. 1416 01:02:06,440 --> 01:02:12,980 And with biology, DNA and RNA and replication, and whatnot. 1417 01:02:12,980 --> 01:02:14,990 Reproduction, and then that leads to evolution, 1418 01:02:14,990 --> 01:02:18,180 which is an emergent property. 1419 01:02:18,180 --> 01:02:22,100 So I'm going to show you some programs 1420 01:02:22,100 --> 01:02:28,450 that I wrote, that exhibit some emergent properties. 1421 01:02:28,450 --> 01:02:35,480 So this is sort of a physics simulation, but it's discrete. 1422 01:02:35,480 --> 01:02:36,850 It's not continuous. 1423 01:02:36,850 --> 01:02:42,310 And physics is modeled as continuous integrals and stuff 1424 01:02:42,310 --> 01:02:43,490 like that. 1425 01:02:43,490 --> 01:02:47,420 So here, I'm sort of doing integration, but it's discrete.
1426 01:02:47,420 --> 01:02:50,890 So you have this sort of weird error 1427 01:02:50,890 --> 01:02:52,870 that makes it not exactly like physics, 1428 01:02:52,870 --> 01:02:54,430 but it's still really cool. 1429 01:02:57,850 --> 01:03:01,750 The blue and the red things have different charges. 1430 01:03:01,750 --> 01:03:05,350 And there are coulombic forces acting between them. 1431 01:03:05,350 --> 01:03:07,120 So I'll just play with it. 1432 01:03:07,120 --> 01:03:12,819 Coulombic forces are like plus and minus, opposites attract. 1433 01:03:12,819 --> 01:03:14,860 And when they're the same, they repel each other. 1434 01:03:17,510 --> 01:03:21,279 And so we can change all these properties. 1435 01:03:25,680 --> 01:03:29,770 And the color reflects how charged they are. 1436 01:03:34,900 --> 01:03:36,510 So these things, it's really weird. 1437 01:03:36,510 --> 01:03:40,050 Like, this would never happen in real physics. 1438 01:03:40,050 --> 01:03:43,430 Like, where's this strange energy coming from? 1439 01:03:43,430 --> 01:03:47,530 And I think it's because it's discrete. 1440 01:03:47,530 --> 01:03:49,930 It's discretized and not continuous, 1441 01:03:49,930 --> 01:03:53,940 but it's still pretty cool, pretty interesting. 1442 01:03:53,940 --> 01:03:56,900 Yeah, repulsion. 1443 01:03:56,900 --> 01:03:58,153 And now it's stable. 1444 01:04:04,479 --> 01:04:06,770 What I want you to keep in mind in looking at all these 1445 01:04:06,770 --> 01:04:08,640 is that with every one of these examples, 1446 01:04:08,640 --> 01:04:12,290 there are very simple rules-- pretty simple rules-- that 1447 01:04:12,290 --> 01:04:15,110 govern the behavior of each individual one. 1448 01:04:15,110 --> 01:04:18,260 And the rules that govern one of these particles, 1449 01:04:18,260 --> 01:04:20,600 one of these balls, are no different than the rules 1450 01:04:20,600 --> 01:04:23,190 that govern all the other ones. 1451 01:04:23,190 --> 01:04:25,460 So these simple rules that are local 1452 01:04:25,460 --> 01:04:28,160 lead to global phenomena, emergent behavior. 1453 01:04:28,160 --> 01:04:30,170 This is what emergence is all about-- 1454 01:04:30,170 --> 01:04:34,580 things at higher levels emerging from simpler things 1455 01:04:34,580 --> 01:04:35,330 on lower levels. 1456 01:04:39,820 --> 01:04:43,110 So for example, crystallization in nature 1457 01:04:43,110 --> 01:04:45,460 is an example of an emergent property. 1458 01:04:45,460 --> 01:04:48,140 So here is a set of presets, a set of settings 1459 01:04:48,140 --> 01:04:51,660 that leads to crystallization. 1460 01:04:51,660 --> 01:04:55,032 It's tuned to crystallize. 1461 01:04:55,032 --> 01:04:57,740 JUSTIN CURRY: And see, that's an interesting thing to point out 1462 01:04:57,740 --> 01:05:01,150 is that we have this level of description, 1463 01:05:01,150 --> 01:05:02,310 which you described. 1464 01:05:02,310 --> 01:05:04,470 We call this a crystal. 1465 01:05:04,470 --> 01:05:06,740 We don't say, it's the arrangement that arises 1466 01:05:06,740 --> 01:05:10,350 when particle i acts on particle j, 1467 01:05:10,350 --> 01:05:12,770 and has the following charge. 1468 01:05:12,770 --> 01:05:15,510 Like, that level of description is too fundamental. 1469 01:05:15,510 --> 01:05:19,092 But this higher level description of a crystal 1470 01:05:19,092 --> 01:05:21,648 is much more convenient, right? 1471 01:05:21,648 --> 01:05:22,564 CURRAN KELLEHER: Yeah. 1472 01:05:29,008 --> 01:05:29,508 Yeah, Atif.
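[Curran's actual demo code isn't included with the transcript, so here is a minimal sketch, in Python, of the kind of program being described: every particle follows the same local rule, a Coulomb-style pairwise force, and positions advance by a discrete Euler step. All constants and names below are illustrative guesses, not the lecture's real parameters; the finite time step DT is exactly where the "strange energy" mentioned above sneaks in.]

```python
import random

K = 100.0   # Coulomb-like constant (illustrative value, not from the demo)
DT = 0.01   # discrete time step; the source of the "weird error" above

class Particle:
    def __init__(self):
        self.x, self.y = random.uniform(0, 1), random.uniform(0, 1)
        self.vx = self.vy = 0.0
        self.q = random.choice([-1.0, 1.0])   # the red/blue charges

def step(particles):
    # Same local rule for every particle: sum the pairwise Coulomb forces.
    for p in particles:
        fx = fy = 0.0
        for other in particles:
            if other is p:
                continue
            dx, dy = p.x - other.x, p.y - other.y
            r2 = dx * dx + dy * dy + 1e-6    # softening avoids divide-by-zero
            f = K * p.q * other.q / r2       # F = k q_i q_j / r^2
            r = r2 ** 0.5
            fx += f * dx / r                 # like charges repel,
            fy += f * dy / r                 # opposite charges attract
        # Euler integration: because DT is finite, energy is not exactly
        # conserved, which is where the strange extra energy comes from.
        p.vx += fx * DT
        p.vy += fy * DT
    for p in particles:
        p.x += p.vx * DT
        p.y += p.vy * DT

particles = [Particle() for _ in range(100)]
for _ in range(1000):
    step(particles)   # some settings settle into crystal-like lattices
```

[Sweeping K, DT, and the mix of charges is roughly what a "preset" amounts to; for some settings the particles lock into the regular arrangements being called crystals here.]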
1473 01:05:29,508 --> 01:05:32,318 AUDIENCE: Even if we [? do say ?] crystal, 1474 01:05:32,318 --> 01:05:34,800 we may mean exactly the same thing. 1475 01:05:34,800 --> 01:05:37,500 It's like we're just dodging around exactly 1476 01:05:37,500 --> 01:05:39,666 what the thing is. 1477 01:05:39,666 --> 01:05:41,790 CURRAN KELLEHER: You said that when we say crystal, 1478 01:05:41,790 --> 01:05:43,123 we're not really saying a thing. 1479 01:05:43,123 --> 01:05:46,740 We're saying-- we're sort of beating around the bush, right? 1480 01:05:46,740 --> 01:05:47,240 Saying-- 1481 01:05:47,240 --> 01:05:50,130 AUDIENCE: Yeah, it's like, OK, so we have these crystals. 1482 01:05:50,130 --> 01:05:53,466 Their composition is of different particles 1483 01:05:53,466 --> 01:05:58,410 that usually attract each other to make certain easily 1484 01:05:58,410 --> 01:06:00,105 discernible shapes. 1485 01:06:00,105 --> 01:06:00,980 CURRAN KELLEHER: Sure. 1486 01:06:00,980 --> 01:06:02,771 AUDIENCE: That's the definition of crystal. 1487 01:06:02,771 --> 01:06:06,540 Once you see it as it is, it's like the set 1488 01:06:06,540 --> 01:06:11,400 of all these different configurations of matter, 1489 01:06:11,400 --> 01:06:13,282 such that there's probably holes. 1490 01:06:13,282 --> 01:06:16,794 You just like-- you're just giving it 1491 01:06:16,794 --> 01:06:18,210 a name [? for some description. ?] 1492 01:06:18,210 --> 01:06:18,860 CURRAN KELLEHER: Exactly. 1493 01:06:18,860 --> 01:06:19,800 You're exactly right. 1494 01:06:19,800 --> 01:06:23,480 So when we say crystal, we mean a very high level thing. 1495 01:06:23,480 --> 01:06:25,980 And this is what I mean when I say low level and high level. 1496 01:06:25,980 --> 01:06:30,570 Low level means describing the exact way 1497 01:06:30,570 --> 01:06:34,290 that the particles interact and different things that they do. 1498 01:06:34,290 --> 01:06:36,220 But crystal is a higher level. 1499 01:06:36,220 --> 01:06:38,460 And so the lower level details could be different. 1500 01:06:38,460 --> 01:06:43,560 Crystals form out of all kinds of different substances 1501 01:06:43,560 --> 01:06:44,251 in nature. 1502 01:06:44,251 --> 01:06:45,750 And this-- we can call it a crystal, 1503 01:06:45,750 --> 01:06:48,132 because it sort of resembles the crystals in nature, 1504 01:06:48,132 --> 01:06:49,590 in that it has this regular structure. 1505 01:06:49,590 --> 01:06:50,739 But yeah, you're right. 1506 01:06:50,739 --> 01:06:52,530 It's sort of glossing over all the details. 1507 01:06:52,530 --> 01:06:54,110 It's existing at a higher level. 1508 01:06:54,110 --> 01:06:55,745 It's a higher level of description. 1509 01:06:55,745 --> 01:06:58,580 AUDIENCE: So it's like is this mathematical [INAUDIBLE] 1510 01:06:58,580 --> 01:07:00,180 category theory. 1511 01:07:00,180 --> 01:07:03,230 So you can say, OK, we have these different fields 1512 01:07:03,230 --> 01:07:04,750 of everything. 1513 01:07:04,750 --> 01:07:07,230 But what happens over here can be 1514 01:07:07,230 --> 01:07:10,260 mapped to exactly what happens over there. 1515 01:07:10,260 --> 01:07:15,090 When you're describing things as crystals or whatever, 1516 01:07:15,090 --> 01:07:17,130 you're just like [INAUDIBLE] speaking 1517 01:07:17,130 --> 01:07:21,585 category theory language that is just like normal speech. 1518 01:07:21,585 --> 01:07:23,960 CURRAN KELLEHER: Justin knows more about category theory. 1519 01:07:23,960 --> 01:07:25,847 I have no idea about category theory.
1520 01:07:25,847 --> 01:07:26,930 JUSTIN CURRY: Oh, I mean-- 1521 01:07:26,930 --> 01:07:29,130 one, I wanted just to tell you to be careful. 1522 01:07:29,130 --> 01:07:29,790 But two, yes. 1523 01:07:29,790 --> 01:07:30,716 You're kind of right. 1524 01:07:30,716 --> 01:07:35,440 And you're just looking for general features in systems, 1525 01:07:35,440 --> 01:07:39,470 and then ascribing to that universality. 1526 01:07:39,470 --> 01:07:40,870 Exactly. 1527 01:07:40,870 --> 01:07:43,159 I wanted to point out really quickly that what 1528 01:07:43,159 --> 01:07:44,710 I wrote up there on the board-- 1529 01:07:44,710 --> 01:07:48,580 F_ij = k q_i q_j / r^2. 1530 01:07:48,580 --> 01:07:53,693 I mean, that's just your Coulomb rule of attraction. 1531 01:07:53,693 --> 01:07:58,660 [INAUDIBLE] And this is how-- 1532 01:07:58,660 --> 01:08:01,480 and we have this level of description 1533 01:08:01,480 --> 01:08:03,790 for the interaction between two particles, 1534 01:08:03,790 --> 01:08:06,900 but what happens when we have n particles? 1535 01:08:06,900 --> 01:08:09,230 Suddenly, the equations become really hard to solve. 1536 01:08:09,230 --> 01:08:12,395 But you start getting really interesting geometric behavior, 1537 01:08:12,395 --> 01:08:16,479 which is easier to describe up top. 1538 01:08:16,479 --> 01:08:20,318 But we're going to hit your idea later, 1539 01:08:20,318 --> 01:08:22,359 because you're going to see examples where we see 1540 01:08:22,359 --> 01:08:25,020 a similar concept to the force. 1541 01:08:25,020 --> 01:08:27,889 But there are no forces going on. 1542 01:08:27,889 --> 01:08:30,653 We'll talk about that with traffic flow, things like that. 1543 01:08:30,653 --> 01:08:32,960 We don't actually have cars ramming into each other, 1544 01:08:32,960 --> 01:08:35,632 but we still get the same behavior. 1545 01:08:35,632 --> 01:08:38,590 But we'll save that. 1546 01:08:38,590 --> 01:08:41,010 CURRAN KELLEHER: This is another set of parameters that 1547 01:08:41,010 --> 01:08:43,189 leads to this droplet forming. 1548 01:08:43,189 --> 01:08:45,160 It's pretty cool. 1549 01:08:45,160 --> 01:08:48,840 And here's the n-body problem being simulated. 1550 01:08:48,840 --> 01:08:51,730 Oh, there it is. 1551 01:08:51,730 --> 01:08:54,240 So it's just things orbiting around each other, 1552 01:08:54,240 --> 01:08:57,235 based on pretty much that equation 1553 01:08:57,235 --> 01:08:58,360 that he wrote on the board. 1554 01:08:58,360 --> 01:09:00,526 JUSTIN CURRY: Yeah, except you could replace charges 1555 01:09:00,526 --> 01:09:06,080 with masses, as q_i and q_j represent charges [INAUDIBLE]. 1556 01:09:06,080 --> 01:09:08,660 CURRAN KELLEHER: Something like that, yeah. 1557 01:09:08,660 --> 01:09:09,310 But that's it. 1558 01:09:09,310 --> 01:09:12,080 AUDIENCE: They have some that's unpredictable, right? 1559 01:09:12,080 --> 01:09:12,729 CURRAN KELLEHER: Yeah, this is-- 1560 01:09:12,729 --> 01:09:14,312 AUDIENCE: Actually, it is predictable, 1561 01:09:14,312 --> 01:09:15,627 but you have to [INAUDIBLE]. 1562 01:09:15,627 --> 01:09:17,710 JUSTIN CURRY: It's deterministic, not predictable. 1563 01:09:17,710 --> 01:09:18,779 CURRAN KELLEHER: It's deterministic 1564 01:09:18,779 --> 01:09:19,612 and not predictable. 1565 01:09:19,612 --> 01:09:23,790 It's chaotic, because it becomes nonlinear. 1566 01:09:23,790 --> 01:09:24,790 Is that correct, Justin?
1567 01:09:24,790 --> 01:09:27,123 AUDIENCE: So the only way to know what's going to happen 1568 01:09:27,123 --> 01:09:27,748 is [INAUDIBLE]. 1569 01:09:27,748 --> 01:09:28,664 CURRAN KELLEHER: Yeah. 1570 01:09:28,664 --> 01:09:30,380 And even then, you're approximating. 1571 01:09:30,380 --> 01:09:31,500 It's not continuous. 1572 01:09:31,500 --> 01:09:33,759 So you're going to get an approximation of what 1573 01:09:33,759 --> 01:09:34,300 might happen. 1574 01:09:36,920 --> 01:09:38,950 Yeah, you can't really solve the n-body problem. 1575 01:09:38,950 --> 01:09:42,540 AUDIENCE: So what if you've got your mind, right? 1576 01:09:42,540 --> 01:09:45,870 So what if-- if it is in different bunch of problems, 1577 01:09:45,870 --> 01:09:48,970 even in that thing and you're only talking about particles. 1578 01:09:48,970 --> 01:09:50,120 Those are simple stuff. 1579 01:09:50,120 --> 01:09:51,540 But when you have minds, you have 1580 01:09:51,540 --> 01:09:53,160 a bunch of these steps going off together. 1581 01:09:53,160 --> 01:09:54,090 CURRAN KELLEHER: When you have what? 1582 01:09:54,090 --> 01:09:55,090 AUDIENCE: When you have minds. 1583 01:09:55,090 --> 01:09:56,280 CURRAN KELLEHER: Minds? 1584 01:09:56,280 --> 01:09:56,480 AUDIENCE: Yeah. 1585 01:09:56,480 --> 01:09:57,396 CURRAN KELLEHER: Yeah. 1586 01:09:57,396 --> 01:10:00,210 So when you consider a mind to be 1587 01:10:00,210 --> 01:10:02,910 one of these particles in society, 1588 01:10:02,910 --> 01:10:06,150 interacting with all the other minds around it, 1589 01:10:06,150 --> 01:10:07,870 this is called agent based modeling. 1590 01:10:07,870 --> 01:10:09,270 This is agent based modeling. 1591 01:10:09,270 --> 01:10:11,160 Flocking is when you have-- 1592 01:10:11,160 --> 01:10:13,290 AUDIENCE: But the problem is for the particles, 1593 01:10:13,290 --> 01:10:14,805 you have different rules. 1594 01:10:14,805 --> 01:10:17,430 But for societies, different than minds themselves, 1595 01:10:17,430 --> 01:10:19,619 are hard to predict. 1596 01:10:19,619 --> 01:10:20,910 CURRAN KELLEHER: Yeah, exactly. 1597 01:10:20,910 --> 01:10:25,110 So you're saying if we model society 1598 01:10:25,110 --> 01:10:29,100 as this agent based model, where each agent consists of a mind, 1599 01:10:29,100 --> 01:10:33,390 it's even more impossible to predict because the mind itself 1600 01:10:33,390 --> 01:10:34,680 is not really predictable. 1601 01:10:34,680 --> 01:10:39,030 The mind itself is emergent out of the things that comprise it. 1602 01:10:39,030 --> 01:10:40,812 So yeah, I mean, that's-- 1603 01:10:40,812 --> 01:10:42,520 JUSTIN CURRY: It adds an interesting idea 1604 01:10:42,520 --> 01:10:45,010 of going backwards, starting with behavior 1605 01:10:45,010 --> 01:10:47,676 and then trying to figure out how the [? world governs ?] it. 1606 01:10:47,676 --> 01:10:50,400 So it's kind of like, how did Newton figure out 1607 01:10:50,400 --> 01:10:53,010 this Law of Attraction? 1608 01:10:53,010 --> 01:10:54,640 He started from the behavior and tried 1609 01:10:54,640 --> 01:10:59,060 to deduce or infer, better yet, what the law between two things 1610 01:10:59,060 --> 01:11:00,305 is. 1611 01:11:00,305 --> 01:11:05,244 And [INAUDIBLE] but cars, and I guess even with people, 1612 01:11:05,244 --> 01:11:06,410 is we don't know the people. 1613 01:11:06,410 --> 01:11:08,451 We don't know the interaction between two people, 1614 01:11:08,451 --> 01:11:12,451 but we see the overall [? behave. ?] 
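[Since agent-based modeling and flocking just came up: the textbook example is Reynolds-style "boids," where every agent follows the same few local rules and a coordinated flock emerges with no leader, much like the queenless ant colony from earlier. A minimal sketch, keeping just two of the classic three rules (alignment and cohesion) and using made-up parameters, not any program from the lecture:]

```python
import math
import random

N, SPEED, RADIUS = 50, 0.01, 0.2   # made-up parameters

# Each agent is just a position plus a heading angle.
boids = [{"x": random.random(), "y": random.random(),
          "a": random.uniform(0, 2 * math.pi)} for _ in range(N)]

def step(boids):
    new_angles = []
    for b in boids:
        # An agent only sees its neighbors: the rule is purely local.
        near = [o for o in boids if o is not b and
                math.hypot(o["x"] - b["x"], o["y"] - b["y"]) < RADIUS]
        if not near:
            new_angles.append(b["a"])
            continue
        # Alignment: the neighbors' average heading, as a vector.
        ax = sum(math.cos(o["a"]) for o in near)
        ay = sum(math.sin(o["a"]) for o in near)
        # Cohesion: the direction toward the neighbors' center of mass.
        cx = sum(o["x"] for o in near) / len(near) - b["x"]
        cy = sum(o["y"] for o in near) / len(near) - b["y"]
        am = math.hypot(ax, ay) or 1.0
        cm = math.hypot(cx, cy) or 1.0
        # Blend the two urges as unit vectors (avoids angle-wrap bugs).
        wx = 0.8 * ax / am + 0.2 * cx / cm
        wy = 0.8 * ay / am + 0.2 * cy / cm
        new_angles.append(math.atan2(wy, wx))
    for b, a in zip(boids, new_angles):
        b["a"] = a
        b["x"] = (b["x"] + SPEED * math.cos(a)) % 1.0   # wrap-around world
        b["y"] = (b["y"] + SPEED * math.sin(a)) % 1.0

for _ in range(500):
    step(boids)   # coherent moving groups emerge; no boid is in charge
```

[Each boid reads only its neighbors, yet the population settles into moving flocks. Starting from that global behavior and trying to guess the local rule is the reverse problem Justin just described Newton solving.]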
1615 01:11:12,451 --> 01:11:14,200 CURRAN KELLEHER: So here's another version 1616 01:11:14,200 --> 01:11:18,130 of that program, where you can get these molecules 1617 01:11:18,130 --> 01:11:20,680 to form, which is really fascinating, 1618 01:11:20,680 --> 01:11:23,230 these stringing molecules. 1619 01:11:23,230 --> 01:11:26,810 So it's another kind of emergent behavior. 1620 01:11:26,810 --> 01:11:29,110 See, there they go. 1621 01:11:29,110 --> 01:11:30,610 It's totally a molecule. 1622 01:11:30,610 --> 01:11:31,526 Check it out. 1623 01:11:34,035 --> 01:11:35,660 And the rules are pretty much the same. 1624 01:11:35,660 --> 01:11:40,980 I don't know exactly what particular rules they are. 1625 01:11:40,980 --> 01:11:44,150 And here it is in 3D, also. 1626 01:11:44,150 --> 01:11:46,310 So if we wait for a minute, these molecules 1627 01:11:46,310 --> 01:11:49,730 are going to form in 3D. 1628 01:11:49,730 --> 01:11:58,140 So I mean, emergence is really a cool concept. 1629 01:11:58,140 --> 01:12:01,277 So here is this molecule in 3D. 1630 01:12:01,277 --> 01:12:03,360 AUDIENCE: Could it be that the world is constantly 1631 01:12:03,360 --> 01:12:06,179 describing itself in different levels of description? 1632 01:12:06,179 --> 01:12:07,970 CURRAN KELLEHER: Could it be that the world 1633 01:12:07,970 --> 01:12:10,100 is describing itself in levels, different levels 1634 01:12:10,100 --> 01:12:11,186 of description? 1635 01:12:11,186 --> 01:12:14,020 AUDIENCE: Yeah, because you do have that, and then you also 1636 01:12:14,020 --> 01:12:20,652 have something higher that [INAUDIBLE] something 1637 01:12:20,652 --> 01:12:21,360 like that. 1638 01:12:21,360 --> 01:12:25,365 But you can also describe it as a bunch of different particles 1639 01:12:25,365 --> 01:12:27,405 [INAUDIBLE]. 1640 01:12:27,405 --> 01:12:29,575 You can also describe it as somebody 1641 01:12:29,575 --> 01:12:31,321 actually looked at that thing itself. 1642 01:12:31,321 --> 01:12:33,320 JUSTIN CURRY: Yeah, it was doing the describing. 1643 01:12:33,320 --> 01:12:34,670 AUDIENCE: Yeah, it was. 1644 01:12:34,670 --> 01:12:36,440 CURRAN KELLEHER: Yeah, I mean, it's 1645 01:12:36,440 --> 01:12:42,860 what Douglas Hofstadter calls a tangled hierarchy, where 1646 01:12:42,860 --> 01:12:44,150 it's not really a hierarchy. 1647 01:12:44,150 --> 01:12:47,090 Because things on the lower levels 1648 01:12:47,090 --> 01:12:49,370 are related to things on higher levels, and can have 1649 01:12:49,370 --> 01:12:51,270 influence back and forth. 1650 01:12:51,270 --> 01:12:55,400 So just like you said, the world is constantly describing 1651 01:12:55,400 --> 01:12:59,690 itself, interacting with itself between different layers, 1652 01:12:59,690 --> 01:13:01,090 different levels. 1653 01:13:01,090 --> 01:13:02,110 And it's just like-- 1654 01:13:02,110 --> 01:13:05,952 AUDIENCE: Can we say that the higher level also-- 1655 01:13:05,952 --> 01:13:08,980 can it point us to the lower level, or can it-- it was just 1656 01:13:08,980 --> 01:13:10,520 lower to [INAUDIBLE]. 1657 01:13:10,520 --> 01:13:12,080 CURRAN KELLEHER: So you said, can we 1658 01:13:12,080 --> 01:13:14,288 say that the higher level influences the lower level? 1659 01:13:14,288 --> 01:13:16,010 Or does it always go up? 1660 01:13:16,010 --> 01:13:18,440 Well, no, it definitely does not go only upward, 1661 01:13:18,440 --> 01:13:20,360 because think of software. 
1662 01:13:20,360 --> 01:13:24,060 If I run this program, this next program-- 1663 01:13:24,060 --> 01:13:25,910 which we'll just watch for a while-- 1664 01:13:25,910 --> 01:13:28,680 this program itself is controlling 1665 01:13:28,680 --> 01:13:31,370 the transistors and whatnot that it's operating on-- 1666 01:13:31,370 --> 01:13:34,210 AUDIENCE: But it's not really controlling the transistors. 1667 01:13:34,210 --> 01:13:35,380 CURRAN KELLEHER: It's not really controlling it? 1668 01:13:35,380 --> 01:13:36,421 AUDIENCE: No, no, no, no. 1669 01:13:36,421 --> 01:13:37,190 Look, look! 1670 01:13:37,190 --> 01:13:38,840 You've got the code. 1671 01:13:38,840 --> 01:13:40,870 You put it into the RAM. 1672 01:13:40,870 --> 01:13:43,750 There's something that's represented 1673 01:13:43,750 --> 01:13:45,620 by different electrical activities. 1674 01:13:45,620 --> 01:13:48,335 And those electrical activities are computed 1675 01:13:48,335 --> 01:13:49,520 and they're different. 1676 01:13:49,520 --> 01:13:54,260 Like, hardware is [INAUDIBLE] much 1677 01:13:54,260 --> 01:13:56,106 like different hardware they had here. 1678 01:13:56,106 --> 01:13:59,215 And then it had [INAUDIBLE]. 1679 01:13:59,215 --> 01:14:03,510 There's no higher thing [INAUDIBLE].. 1680 01:14:03,510 --> 01:14:06,090 CURRAN KELLEHER: Yeah, it's all one. 1681 01:14:06,090 --> 01:14:08,089 It's all one thing. 1682 01:14:08,089 --> 01:14:09,630 It's just, we think about it in terms 1683 01:14:09,630 --> 01:14:11,184 of high level and low level. 1684 01:14:11,184 --> 01:14:11,850 So you're right. 1685 01:14:11,850 --> 01:14:15,580 I mean, the hardware is controlling itself. 1686 01:14:15,580 --> 01:14:18,070 But it's based on what we put into it, 1687 01:14:18,070 --> 01:14:21,910 and what we say-- tell it to do at a higher level. 1688 01:14:21,910 --> 01:14:25,280 So I mean, the software that I wrote, which came from my mind, 1689 01:14:25,280 --> 01:14:28,950 it's at a higher level, in a sense, than the hardware. 1690 01:14:28,950 --> 01:14:31,425 Like this projector that's actually projecting pixels. 1691 01:14:34,760 --> 01:14:37,170 So that's what I mean when I say the higher level things 1692 01:14:37,170 --> 01:14:40,680 influence, affect, control, even, in this case, the lower 1693 01:14:40,680 --> 01:14:41,530 level things. 1694 01:14:41,530 --> 01:14:43,480 It goes both ways. 1695 01:14:43,480 --> 01:14:45,854 And if a transistor were to crap out right now, 1696 01:14:45,854 --> 01:14:47,520 it would control the higher level things 1697 01:14:47,520 --> 01:14:50,340 because it would just stop working. 1698 01:14:50,340 --> 01:14:52,299 JUSTIN CURRY: It's constant causality, I guess. 1699 01:14:52,299 --> 01:14:53,673 CURRAN KELLEHER: Causality goes-- 1700 01:14:53,673 --> 01:14:55,260 AUDIENCE: That doesn't make any sense. 1701 01:14:55,260 --> 01:14:59,110 OK, how can a software program control hardware? 1702 01:14:59,110 --> 01:15:00,920 CURRAN KELLEHER: So this software program 1703 01:15:00,920 --> 01:15:03,660 is running on hardware. 1704 01:15:03,660 --> 01:15:06,032 And it's controlling this projector. 1705 01:15:06,032 --> 01:15:07,740 It's controlling the hardware, physical-- 1706 01:15:07,740 --> 01:15:09,490 AUDIENCE: OK, is it telling you what to do 1707 01:15:09,490 --> 01:15:11,510 or is the hardware telling itself what to do? 1708 01:15:11,510 --> 01:15:13,968 CURRAN KELLEHER: The hardware is telling itself what to do, 1709 01:15:13,968 --> 01:15:16,205 but only after I've told it what to do.
1710 01:15:16,205 --> 01:15:19,135 JUSTIN CURRY: Right, but even if we were to remove Curran, 1711 01:15:19,135 --> 01:15:25,020 and we just had the software executing by itself, 1712 01:15:25,020 --> 01:15:28,180 you're right in the sense that it's always just the hardware. 1713 01:15:28,180 --> 01:15:29,470 It's just the hardware, right? 1714 01:15:29,470 --> 01:15:36,170 The software doesn't really exist in this ethereal realm. 1715 01:15:36,170 --> 01:15:37,770 Right? 1716 01:15:37,770 --> 01:15:40,350 Software is still just what's going 1717 01:15:40,350 --> 01:15:42,670 on on the level of transistors and stuff. 1718 01:15:42,670 --> 01:15:48,380 It's just that we as humans use these levels of description. 1719 01:15:48,380 --> 01:15:51,380 And we talk about the software as its own entity, 1720 01:15:51,380 --> 01:15:55,400 even though it's fundamentally still just caused 1721 01:15:55,400 --> 01:15:59,421 by the interactions of electrons and transistors, and such. 1722 01:15:59,421 --> 01:16:01,170 CURRAN KELLEHER: So here's another example 1723 01:16:01,170 --> 01:16:03,110 with some traffic flow. 1724 01:16:03,110 --> 01:16:07,860 So traffic flow is a perfect example of emergence, 1725 01:16:07,860 --> 01:16:10,870 and what it means. 1726 01:16:10,870 --> 01:16:14,670 So like these little bars, it's just like one lane of traffic 1727 01:16:14,670 --> 01:16:16,080 that's just going. 1728 01:16:16,080 --> 01:16:18,810 And it repeats itself. 1729 01:16:18,810 --> 01:16:21,290 Each one of these cars is like an agent. 1730 01:16:21,290 --> 01:16:25,890 It's a lower level of description than the waves 1731 01:16:25,890 --> 01:16:28,230 that we'll see right here. 1732 01:16:28,230 --> 01:16:30,300 So say there's a red light, right. 1733 01:16:30,300 --> 01:16:32,130 So the traffic gets backed up a little bit, 1734 01:16:32,130 --> 01:16:34,520 and then the light turns green again. 1735 01:16:34,520 --> 01:16:35,457 And then they go. 1736 01:16:35,457 --> 01:16:37,290 And so you see this thing on a higher level. 1737 01:16:37,290 --> 01:16:40,260 It's this wave, which is propagating back. 1738 01:16:40,260 --> 01:16:44,160 So we say the wave is a thing, and we 1739 01:16:44,160 --> 01:16:45,720 can describe it as an entity. 1740 01:16:45,720 --> 01:16:48,340 But it's at a higher level than the cars themselves, 1741 01:16:48,340 --> 01:16:51,330 even though it's comprised of the cars, it affects the cars, 1742 01:16:51,330 --> 01:16:53,340 and the cars affect it. 1743 01:16:53,340 --> 01:16:57,592 It's the same thing with software and hardware. 1744 01:16:57,592 --> 01:16:59,050 JUSTIN CURRY: I still don't get it. 1745 01:16:59,050 --> 01:17:00,646 AUDIENCE: You still don't get it? 1746 01:17:00,646 --> 01:17:02,020 JUSTIN CURRY: I guess it's really 1747 01:17:02,020 --> 01:17:04,342 just an issue of reductionism versus holism, right? 1748 01:17:04,342 --> 01:17:06,050 CURRAN KELLEHER: Yeah, which is exactly-- 1749 01:17:06,050 --> 01:17:07,514 So you feel like everything can be 1750 01:17:07,514 --> 01:17:10,853 understood as reductionist thinking, 1751 01:17:10,853 --> 01:17:12,150 or sometimes we have to-- 1752 01:17:12,150 --> 01:17:13,566 AUDIENCE: Well, at times, you just 1753 01:17:13,566 --> 01:17:17,090 have to think, OK, how does, like, me, 1754 01:17:17,090 --> 01:17:20,644 for instance, how do I come from interactions of particles? 1755 01:17:20,644 --> 01:17:24,647 You have to do synthesis and [INAUDIBLE].. 
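The backward-moving traffic wave Curran describes is easy to reproduce with agents on a ring road. The rules below are a stripped-down version of the well-known Nagel-Schreckenberg model (accelerate if there's room, don't rear-end the car ahead, occasionally dawdle); this is presumably not the applet's own code, and the road length, speed limit, and dawdle probability are invented.

```python
import random

ROAD = 100      # cells on a ring road
VMAX = 5        # speed limit, in cells per tick
P_SLOW = 0.3    # chance a driver dawdles -- this randomness seeds the jams

# map of car position -> speed; start with a stopped clump (the "red light")
cars = {i: 0 for i in range(0, 30, 2)}

def tick(cars):
    """Advance every car one step using only local rules."""
    new = {}
    for x in sorted(cars):
        v = min(cars[x] + 1, VMAX)                  # accelerate toward the limit
        gap = 1
        while (x + gap) % ROAD not in cars and gap <= v:
            gap += 1                                # measure room to the car ahead
        v = min(v, gap - 1)                         # never hit the car in front
        if v > 0 and random.random() < P_SLOW:
            v -= 1                                  # random dawdling
        new[(x + v) % ROAD] = v
    return new

for t in range(30):
    print("".join("#" if x in cars else "." for x in range(ROAD)))
    cars = tick(cars)
```

Printed over time, the clump of #'s (the jam) drifts backward even though every individual car only ever moves forward: the wave is a real, describable thing at its own level, but no single car carries it.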
1756 01:17:24,647 --> 01:17:25,980 JUSTIN CURRY: But that doesn't-- 1757 01:17:25,980 --> 01:17:27,866 does that explain you? 1758 01:17:27,866 --> 01:17:30,410 And Hofstadter asks another question in the same chapter-- 1759 01:17:30,410 --> 01:17:35,920 is, the guy who runs the 100 meters in 9.3 seconds, 1760 01:17:35,920 --> 01:17:37,260 where is the 9.3 stored? 1761 01:17:41,130 --> 01:17:41,950 It's not, right? 1762 01:17:41,950 --> 01:17:43,860 AUDIENCE: It's encoded in his brain. 1763 01:17:43,860 --> 01:17:47,600 And what he calls those brain signals is like 9.3. 1764 01:17:47,600 --> 01:17:49,090 That's what it says. 1765 01:17:49,090 --> 01:17:51,100 JUSTIN CURRY: Right, no, but 9.3-- 1766 01:17:51,100 --> 01:17:54,310 the fact that he ran the 100 meters in 9.3 seconds 1767 01:17:54,310 --> 01:17:59,740 was the result of training, him getting good traction 1768 01:17:59,740 --> 01:18:00,580 at the start. 1769 01:18:00,580 --> 01:18:02,750 It was really an emergent thing. 1770 01:18:02,750 --> 01:18:05,055 It's an epiphenomenon, as Hofstadter calls it. 1771 01:18:09,285 --> 01:18:10,910 CURRAN KELLEHER: Here's a good question 1772 01:18:10,910 --> 01:18:13,145 to ask yourself regarding things that are emergent. 1773 01:18:16,670 --> 01:18:17,365 Do you exist? 1774 01:18:20,460 --> 01:18:22,316 Because, what are you? 1775 01:18:22,316 --> 01:18:26,490 Have you ever asked yourself, like, what am I? 1776 01:18:26,490 --> 01:18:29,970 It's an area where it's easily confusing, 1777 01:18:29,970 --> 01:18:34,680 because you're an emergent property, from the things 1778 01:18:34,680 --> 01:18:36,644 that you're made of. 1779 01:18:36,644 --> 01:18:40,620 AUDIENCE: The problem is it's not to say, do I exist. 1780 01:18:40,620 --> 01:18:44,515 The answer to that question is almost the same answer 1781 01:18:44,515 --> 01:18:46,470 to the question of what am I? 1782 01:18:46,470 --> 01:18:47,716 If I describe myself as-- 1783 01:18:51,055 --> 01:18:53,665 there's like almost two different parts of myself. 1784 01:18:53,665 --> 01:18:59,870 Now, [INAUDIBLE] myself as described by prim processes, 1785 01:18:59,870 --> 01:19:02,900 maybe like some part of my brain has a description 1786 01:19:02,900 --> 01:19:05,990 of what I've done, I've always done, and also my beliefs 1787 01:19:05,990 --> 01:19:06,742 about myself. 1788 01:19:06,742 --> 01:19:08,200 And also, another part of the brain 1789 01:19:08,200 --> 01:19:14,570 has almost a way of going back, and also maybe having 1790 01:19:14,570 --> 01:19:18,340 this self conscious awareness of like seeing the world, 1791 01:19:18,340 --> 01:19:20,854 leaving my description of myself. 1792 01:19:20,854 --> 01:19:21,770 CURRAN KELLEHER: Yeah. 1793 01:19:21,770 --> 01:19:23,430 AUDIENCE: So which one am I? 1794 01:19:23,430 --> 01:19:28,070 Well, probably I am the thing that the brain describes. 1795 01:19:28,070 --> 01:19:29,660 I am the self. 1796 01:19:29,660 --> 01:19:32,540 I am not the thing that's aware of the self. 1797 01:19:32,540 --> 01:19:36,350 Because the thing that's aware of the self only 1798 01:19:36,350 --> 01:19:38,692 exists to be aware of that self. 1799 01:19:38,692 --> 01:19:39,900 So I cannot be the awareness. 1800 01:19:39,900 --> 01:19:43,690 I must be the one that's being aware, if that makes sense. 1801 01:19:43,690 --> 01:19:45,370 CURRAN KELLEHER: So you're basically 1802 01:19:45,370 --> 01:19:49,220 dividing yourself into the observer and the observed. 1803 01:19:49,220 --> 01:19:49,954 Right? 
1804 01:19:49,954 --> 01:19:50,620 The thing that-- 1805 01:19:50,620 --> 01:19:51,911 AUDIENCE: I'm not the observer. 1806 01:19:51,911 --> 01:19:53,614 I'm the one that's being observed. 1807 01:19:53,614 --> 01:19:56,030 CURRAN KELLEHER: So you are the one that's being observed? 1808 01:19:56,030 --> 01:19:58,080 So what is it that's observing that? 1809 01:19:58,080 --> 01:19:58,840 It's not you? 1810 01:19:58,840 --> 01:20:01,370 AUDIENCE: What it is is like this added 1811 01:20:01,370 --> 01:20:03,308 structure of self awareness. 1812 01:20:06,520 --> 01:20:08,860 CURRAN KELLEHER: So that's a-- 1813 01:20:08,860 --> 01:20:11,761 these are the issues that Buddhism grapples with. 1814 01:20:16,618 --> 01:20:17,118 And-- 1815 01:20:17,118 --> 01:20:19,470 AUDIENCE: How does [INAUDIBLE]? 1816 01:20:19,470 --> 01:20:20,970 CURRAN KELLEHER: Well, I don't know. 1817 01:20:20,970 --> 01:20:23,130 I'm not enlightened. 1818 01:20:23,130 --> 01:20:24,720 I don't know the answers. 1819 01:20:24,720 --> 01:20:26,220 So I'm still as confused as you are. 1820 01:20:26,220 --> 01:20:28,261 AUDIENCE: I really think one of the main problems 1821 01:20:28,261 --> 01:20:31,580 is that you think too much. 1822 01:20:31,580 --> 01:20:35,520 If humans just stopped thinking, everything will just be fine. 1823 01:20:35,520 --> 01:20:37,490 CURRAN KELLEHER: So Atif just said 1824 01:20:37,490 --> 01:20:40,740 if humans just stop thinking, everything will be fine. 1825 01:20:40,740 --> 01:20:43,066 And that's what the zen masters say, also. 1826 01:20:43,066 --> 01:20:46,272 JUSTIN CURRY: Well, it's also saying we should just be 1827 01:20:46,272 --> 01:20:49,740 [? featuring it. ?] Which also just says, 1828 01:20:49,740 --> 01:20:52,850 let's give everybody mandatory frontal lobotomies, 1829 01:20:52,850 --> 01:20:55,178 so no one can think any abstract thoughts 1830 01:20:55,178 --> 01:20:56,612 or get worried about anything. 1831 01:20:56,612 --> 01:20:59,480 And we'll just be reduced to basic hunting and surviving, 1832 01:20:59,480 --> 01:21:00,810 if we can and-- 1833 01:21:00,810 --> 01:21:02,410 AUDIENCE: It's saying what Godel says. 1834 01:21:02,410 --> 01:21:04,740 Once you start talking to yourself, 1835 01:21:04,740 --> 01:21:06,534 you're speaking nonsense basically. 1836 01:21:06,534 --> 01:21:07,950 JUSTIN CURRY: If you start talking 1837 01:21:07,950 --> 01:21:09,199 to yourself or about yourself? 1838 01:21:09,199 --> 01:21:11,000 AUDIENCE: Talking about yourself. 1839 01:21:11,000 --> 01:21:15,780 It's like, OK, here's me, if that makes any sense, 1840 01:21:15,780 --> 01:21:17,700 because who's speaking? 1841 01:21:17,700 --> 01:21:19,760 I'm talking about myself. 1842 01:21:19,760 --> 01:21:20,666 JUSTIN CURRY: Yes. 1843 01:21:20,666 --> 01:21:23,520 AUDIENCE: My self-reflective process that I'm talking about 1844 01:21:23,520 --> 01:21:25,366 myself-- 1845 01:21:25,366 --> 01:21:27,240 some of the most abstract process of my mind. 1846 01:21:27,240 --> 01:21:28,864 That means they don't know the details, 1847 01:21:28,864 --> 01:21:31,682 therefore they should not really be talking about myself. 1848 01:21:31,682 --> 01:21:33,140 JUSTIN CURRY: So you're essentially 1849 01:21:33,140 --> 01:21:36,630 arguing that self-reference isn't a well formed thought? 1850 01:21:36,630 --> 01:21:37,924 AUDIENCE: No, no. 1851 01:21:37,924 --> 01:21:38,632 JUSTIN CURRY: No? 1852 01:21:38,632 --> 01:21:39,352 AUDIENCE: Yeah. 1853 01:21:39,352 --> 01:21:40,060 JUSTIN CURRY: OK. 
1854 01:21:42,920 --> 01:21:45,980 CURRAN KELLEHER: Yeah, I mean that's true. 1855 01:21:45,980 --> 01:21:50,210 When you talk about yourself, the thing that's talking 1856 01:21:50,210 --> 01:21:55,040 is not well-informed about what it's actually talking about. 1857 01:21:55,040 --> 01:21:55,770 Right? 1858 01:21:55,770 --> 01:21:56,270 So-- 1859 01:21:56,270 --> 01:21:57,853 AUDIENCE: [? Respect ?] the complexity 1860 01:21:57,853 --> 01:22:00,104 of the brain because the brain is so complex that it 1861 01:22:00,104 --> 01:22:01,937 can't even talk about itself without talking 1862 01:22:01,937 --> 01:22:02,925 about the nonsense. 1863 01:22:02,925 --> 01:22:05,004 It's just almost like looking around, 1864 01:22:05,004 --> 01:22:07,170 like on a weird [INAUDIBLE],, on the weird patterns. 1865 01:22:07,170 --> 01:22:09,720 Like, oh, this may be, and this may be, this may be. 1866 01:22:09,720 --> 01:22:12,327 It's like recognizing [INAUDIBLE].. 1867 01:22:12,327 --> 01:22:14,160 CURRAN KELLEHER: Yeah, so essentially you're 1868 01:22:14,160 --> 01:22:17,677 saying talking about yourself is just rambling nonsense? 1869 01:22:17,677 --> 01:22:19,260 AUDIENCE: It's sort of like intuition, 1870 01:22:19,260 --> 01:22:20,860 like a mathematical intuition. 1871 01:22:20,860 --> 01:22:23,694 You see some structure here, some structure there. 1872 01:22:23,694 --> 01:22:25,060 You say, oh, yeah. 1873 01:22:25,060 --> 01:22:27,370 That [? may ?] [? replace ?] some mathematicals 1874 01:22:27,370 --> 01:22:28,560 or something. 1875 01:22:28,560 --> 01:22:31,290 But it's just a hypothesis, and it's not really-- 1876 01:22:31,290 --> 01:22:34,064 it doesn't have to be true. 1877 01:22:34,064 --> 01:22:35,730 CURRAN KELLEHER: Yeah, so you're getting 1878 01:22:35,730 --> 01:22:40,050 at some really deep questions that we all have to face, 1879 01:22:40,050 --> 01:22:41,690 if we choose to face them. 1880 01:22:41,690 --> 01:22:44,130 So I mean, I encourage you to keep-- 1881 01:22:44,130 --> 01:22:47,579 JUSTIN CURRY: Or as Sandra says, stop thinking about them. 1882 01:22:47,579 --> 01:22:49,620 CURRAN KELLEHER: Yeah, I mean, that's totally it. 1883 01:22:49,620 --> 01:22:51,750 Because if we keep thinking about these things-- 1884 01:22:51,750 --> 01:22:52,980 AUDIENCE: We'll get stuck. 1885 01:22:52,980 --> 01:22:55,313 CURRAN KELLEHER: --either we'll get stuck and go insane, 1886 01:22:55,313 --> 01:22:58,560 and not be able to function, or we'll become enlightened maybe. 1887 01:22:58,560 --> 01:23:00,656 I mean, I don't know what that is, even. 1888 01:23:00,656 --> 01:23:04,097 JUSTIN CURRY: Yeah, I mean, be bold, but not too bold. 1889 01:23:04,097 --> 01:23:06,180 CURRAN KELLEHER: And this gets back to what Justin 1890 01:23:06,180 --> 01:23:08,370 was saying about earlier. 1891 01:23:08,370 --> 01:23:11,240 Introspecting and asking these questions 1892 01:23:11,240 --> 01:23:16,930 is not favorable to our survival. 1893 01:23:16,930 --> 01:23:20,220 So in a sense, we've been programmed to just ignore 1894 01:23:20,220 --> 01:23:22,625 them, and just live our lives. 1895 01:23:22,625 --> 01:23:24,900 JUSTIN CURRY: But we're hoping to evolve past that. 1896 01:23:24,900 --> 01:23:27,986 CURRAN KELLEHER: But, yeah, we're hoping to transcend that. 1897 01:23:27,986 --> 01:23:30,110 AUDIENCE: Perhaps when we build a conscious machine 1898 01:23:30,110 --> 01:23:33,200 or something like that, we shouldn't give it the ability 1899 01:23:33,200 --> 01:23:35,800 to analyze this stuff. 
1900 01:23:35,800 --> 01:23:39,190 But maybe the worst thing any person can ever do 1901 01:23:39,190 --> 01:23:42,850 is to actually know how they think, 1902 01:23:42,850 --> 01:23:45,700 who they actually really are. 1903 01:23:45,700 --> 01:23:50,260 I think HP Lovecraft once said the same thing about that. 1904 01:23:50,260 --> 01:23:54,510 The best thing that [? women, ?] we, I think, ever have 1905 01:23:54,510 --> 01:23:58,030 is that at any given time, we don't ever 1906 01:23:58,030 --> 01:24:00,490 have a complete model of the world. 1907 01:24:00,490 --> 01:24:05,950 Because if we ever did, that's so frightening. 1908 01:24:05,950 --> 01:24:07,450 JUSTIN CURRY: I mean-- 1909 01:24:07,450 --> 01:24:08,650 but exactly. 1910 01:24:08,650 --> 01:24:10,990 On the other hand, though, we're inherently curious 1911 01:24:10,990 --> 01:24:14,296 and we inherently want to understand bits of things. 1912 01:24:14,296 --> 01:24:16,504 And I don't know if we can ever have a complete model 1913 01:24:16,504 --> 01:24:19,752 of the world, as it is. 1914 01:24:19,752 --> 01:24:23,490 But we can sure as heck try. 1915 01:24:23,490 --> 01:24:26,649 And bottom line is you should just 1916 01:24:26,649 --> 01:24:28,690 do whatever makes you feel good. If understanding 1917 01:24:28,690 --> 01:24:31,555 part of the brain does that, then go for it. 1918 01:24:31,555 --> 01:24:34,620 AUDIENCE: Well, you can never fully understand the brain. 1919 01:24:34,620 --> 01:24:37,210 JUSTIN CURRY: Don't let that stop you from trying. 1920 01:24:37,210 --> 01:24:38,630 AUDIENCE: Come on, look. 1921 01:24:38,630 --> 01:24:40,810 It's like one of those existentialists. 1922 01:24:40,810 --> 01:24:42,948 You got a man [? holding ?] this stone 1923 01:24:42,948 --> 01:24:46,942 up a mountain, or something, wants to get the stone dropped 1924 01:24:46,942 --> 01:24:47,900 and then go back again. 1925 01:24:47,900 --> 01:24:50,623 You know, the goal is point [INAUDIBLE] doing the process. 1926 01:24:50,623 --> 01:24:52,456 JUSTIN CURRY: So, I mean, you're essentially 1927 01:24:52,456 --> 01:24:56,060 arguing that understanding the brain is a pointless goal? 1928 01:24:56,060 --> 01:24:59,296 AUDIENCE: Why do something that you can never achieve? 1929 01:24:59,296 --> 01:25:00,294 JUSTIN CURRY: Yeah. 1930 01:25:00,294 --> 01:25:04,290 AUDIENCE: So even if you achieve it, what's the next step? 1931 01:25:04,290 --> 01:25:06,776 Oh, you'll make yourself better. 1932 01:25:06,776 --> 01:25:09,920 JUSTIN CURRY: OK, so yeah. 1933 01:25:09,920 --> 01:25:10,864 I don't know. 1934 01:25:10,864 --> 01:25:13,230 I'm going with Sandra's idea. 1935 01:25:13,230 --> 01:25:15,026 CURRAN KELLEHER: What was Sandra's idea? 1936 01:25:15,026 --> 01:25:16,651 AUDIENCE: Like, there's puzzles, right? 1937 01:25:16,651 --> 01:25:18,256 If you finish the puzzle, you're done. 1938 01:25:18,256 --> 01:25:18,755 Right? 1939 01:25:18,755 --> 01:25:20,280 So what else is next? 1940 01:25:20,280 --> 01:25:22,680 AUDIENCE: You make yourself smarter so you can be harder 1941 01:25:22,680 --> 01:25:25,879 to understand your own self. 1942 01:25:25,879 --> 01:25:30,370 AUDIENCE: It's like essentially skirting the next [INAUDIBLE].. 1943 01:25:30,370 --> 01:25:33,240 CURRAN KELLEHER: So what you're getting at, actually-- 1944 01:25:33,240 --> 01:25:39,630 because what you're saying is sort of like this.
1945 01:25:39,630 --> 01:25:44,310 Once you understand a certain amount of yourself, 1946 01:25:44,310 --> 01:25:47,670 you've added another part of yourself 1947 01:25:47,670 --> 01:25:50,994 which you don't understand. 1948 01:25:50,994 --> 01:25:52,702 AUDIENCE: You mean the part that actually 1949 01:25:52,702 --> 01:25:55,097 understands itself is the part that you actually 1950 01:25:55,097 --> 01:25:56,140 don't understand? 1951 01:25:56,140 --> 01:25:57,056 CURRAN KELLEHER: Yeah. 1952 01:25:57,056 --> 01:26:00,720 So once you've come to terms with yourself, 1953 01:26:00,720 --> 01:26:02,880 or so you think, that part of you, 1954 01:26:02,880 --> 01:26:05,190 which has just come to terms with itself, 1955 01:26:05,190 --> 01:26:06,780 you don't understand. 1956 01:26:06,780 --> 01:26:10,680 That is analogous to adding g to number theory, 1957 01:26:10,680 --> 01:26:14,640 because it can be Godel-ized again and be proved incomplete 1958 01:26:14,640 --> 01:26:15,427 yet again. 1959 01:26:15,427 --> 01:26:17,760 And then you can say, oh, well, it's not incomplete now. 1960 01:26:17,760 --> 01:26:19,140 So I can just add that in. 1961 01:26:19,140 --> 01:26:21,780 This is a big part of some chapter in Godel, Escher, Bach. 1962 01:26:21,780 --> 01:26:23,100 I forget which one. 1963 01:26:23,100 --> 01:26:28,710 But it's essentially, you can never understand yourself. 1964 01:26:28,710 --> 01:26:32,010 I mean, it's a dangerous use of Godel's Incompleteness Theorem. 1965 01:26:32,010 --> 01:26:36,560 And it's probably not well founded. 1966 01:26:36,560 --> 01:26:39,290 But it's something to think about. 1967 01:26:39,290 --> 01:26:43,620 AUDIENCE: [INAUDIBLE] do are like the complexity. 1968 01:26:43,620 --> 01:26:48,150 I think it was some guy that-- he did some calculations. 1969 01:26:48,150 --> 01:26:51,960 And it came out, once a system becomes so complicated, 1970 01:26:51,960 --> 01:26:56,048 it reaches a point where it can't understand itself. 1971 01:26:56,048 --> 01:26:58,547 JUSTIN CURRY: Eventually-- if you could pull that reference, 1972 01:26:58,547 --> 01:27:00,038 that would be good to see. 1973 01:27:00,038 --> 01:27:02,024 But, yeah, [INAUDIBLE]. 1974 01:27:02,024 --> 01:27:03,898 AUDIENCE: I think somebody already told me. 1975 01:27:03,898 --> 01:27:06,590 I don't know who that is. 1976 01:27:06,590 --> 01:27:08,984 CURRAN KELLEHER: Yeah, I mean once a system gets 1977 01:27:08,984 --> 01:27:10,400 to a certain point of complexity-- 1978 01:27:10,400 --> 01:27:12,983 AUDIENCE: Maybe if somebody else can't understand it happening 1979 01:27:12,983 --> 01:27:16,864 and [? understand itself. ?] 1980 01:27:16,864 --> 01:27:17,780 CURRAN KELLEHER: Yeah. 1981 01:27:17,780 --> 01:27:19,160 I mean, we can't understand ourselves 1982 01:27:19,160 --> 01:27:20,990 in the sense of everything that's going on, 1983 01:27:20,990 --> 01:27:23,450 because we can't introspect on our own neurons. 1984 01:27:23,450 --> 01:27:25,250 Right? 1985 01:27:25,250 --> 01:27:27,830 Our neurons, in the case of agent based models, 1986 01:27:27,830 --> 01:27:29,169 like we're talking about-- 1987 01:27:29,169 --> 01:27:30,210 like, check this one out. 1988 01:27:30,210 --> 01:27:33,170 This is pretty cool. 1989 01:27:33,170 --> 01:27:35,240 The balls on top are being pulled upwards, 1990 01:27:35,240 --> 01:27:38,479 and the ones here are being pulled downwards. 1991 01:27:38,479 --> 01:27:39,770 So there's a sort of stability.
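The "Godel-ize it again" move Curran mentions above can be sketched in symbols. Writing G_T for the Godel sentence of a theory T (notation chosen here, not used in the lecture, and glossing over the fine print about Rosser's refinement of the consistency hypothesis):

```latex
\[
T_0 = \mathrm{TNT}, \qquad T_{n+1} = T_n + G_{T_n}
\]
Here $G_{T_n}$ says, in effect, ``I am not provable in $T_n$.'' If $T_n$ is
consistent and recursively axiomatized, G\"odel's construction applies, so
$G_{T_n}$ is unprovable in $T_n$; adding it as a new axiom gives $T_{n+1}$,
which is again consistent and recursively axiomatized, so the construction
applies all over again and yields an unprovable $G_{T_{n+1}}$. Every patch
opens the next hole, at every stage $n$.
\]
```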
1992 01:27:39,770 --> 01:27:42,760 And I think there are the same number of balls. 1993 01:27:42,760 --> 01:27:44,780 I'm not sure, but somehow-- 1994 01:27:44,780 --> 01:27:46,860 I mean, it's so they equal out. 1995 01:27:46,860 --> 01:27:49,610 And this is just agent based modeling, 1996 01:27:49,610 --> 01:27:54,320 applying these force equations but only between the balls that 1997 01:27:54,320 --> 01:27:56,820 are connected to each other. 1998 01:27:56,820 --> 01:27:59,470 So I mean, it's pretty close to physics and it's pretty cool. 1999 01:27:59,470 --> 01:28:02,870 What was I going to say about agent? 2000 01:28:06,314 --> 01:28:08,674 OK, can someone remind me what we were talking about? 2001 01:28:11,749 --> 01:28:13,790 JUSTIN CURRY: I wanted to comment about this idea 2002 01:28:13,790 --> 01:28:16,341 of modeling the universe. 2003 01:28:16,341 --> 01:28:17,840 If you're really interested, I would 2004 01:28:17,840 --> 01:28:19,298 recommend a book called Programming 2005 01:28:19,298 --> 01:28:21,800 the Universe by Seth Lloyd. 2006 01:28:21,800 --> 01:28:28,090 And essentially, it says, well, the universe, at the bottom, 2007 01:28:28,090 --> 01:28:29,730 uses quantum mechanics. 2008 01:28:29,730 --> 01:28:32,560 We can't model quantum mechanics effectively 2009 01:28:32,560 --> 01:28:34,770 on the classical computer. 2010 01:28:34,770 --> 01:28:37,450 In fact, we need a quantum computer. 2011 01:28:37,450 --> 01:28:40,310 But the universe itself is a quantum computer. 2012 01:28:40,310 --> 01:28:42,565 And the only thing which could model the universe 2013 01:28:42,565 --> 01:28:45,010 is the universe itself. 2014 01:28:45,010 --> 01:28:46,354 So we can't build-- 2015 01:28:46,354 --> 01:28:48,145 I mean, we'd have to build another universe 2016 01:28:48,145 --> 01:28:49,310 to model the other one. 2017 01:28:49,310 --> 01:28:50,684 But why do that when we already 2018 01:28:50,684 --> 01:28:53,644 have the universe, which is computing itself? 2019 01:28:53,644 --> 01:28:55,310 So I mean, it's a very interesting book, 2020 01:28:55,310 --> 01:28:58,020 and Seth Lloyd's a very prominent physicist. 2021 01:28:58,020 --> 01:29:00,100 And he's a mechanical engineer here at MIT, 2022 01:29:00,100 --> 01:29:02,670 who works with the Santa Fe Institute of Complexity 2023 01:29:02,670 --> 01:29:04,690 Science. 2024 01:29:04,690 --> 01:29:06,890 It's a very good book. 2025 01:29:06,890 --> 01:29:10,180 But I think it gets kind of at the heart of the problem-- 2026 01:29:10,180 --> 01:29:15,090 that really the tools that we need to describe the universe 2027 01:29:15,090 --> 01:29:17,790 are really the bits of the universe itself. 2028 01:29:17,790 --> 01:29:20,560 AUDIENCE: But don't you have some of those with properties 2029 01:29:20,560 --> 01:29:21,430 of infinities. 2030 01:29:21,430 --> 01:29:24,080 You know, one like-- 2031 01:29:24,080 --> 01:29:25,790 the whole thing [INAUDIBLE]. 2032 01:29:25,790 --> 01:29:27,540 Even in that thing, there's another thing 2033 01:29:27,540 --> 01:29:29,470 that's also infinite. 2034 01:29:29,470 --> 01:29:31,810 So you can have a quantum mechanical computer 2035 01:29:31,810 --> 01:29:35,500 inside of the universe that's also 2036 01:29:35,500 --> 01:29:36,882 sort of emulating the universe. 2037 01:29:36,882 --> 01:29:38,340 JUSTIN CURRY: Yeah, but it can only 2038 01:29:38,340 --> 01:29:41,460 emulate, pretty much, a part of the universe of equal size. 2039 01:29:41,460 --> 01:29:41,960 Right?
2040 01:29:41,960 --> 01:29:47,077 A quantum computer involving setting the quantum bits 2041 01:29:47,077 --> 01:29:52,860 can only essentially model something that big, right? 2042 01:29:52,860 --> 01:29:54,750 But the information content of the universe, 2043 01:29:54,750 --> 01:29:57,415 really, is equal to the amount of information 2044 01:29:57,415 --> 01:30:00,144 the universe can compute with. 2045 01:30:00,144 --> 01:30:02,060 So you need a computer as big as the universe, 2046 01:30:02,060 --> 01:30:03,407 so you can model the universe. 2047 01:30:03,407 --> 01:30:04,990 CURRAN KELLEHER: Down to every detail. 2048 01:30:04,990 --> 01:30:05,890 JUSTIN CURRY: Down to every detail. 2049 01:30:05,890 --> 01:30:07,265 CURRAN KELLEHER: So all we can do 2050 01:30:07,265 --> 01:30:08,629 is approximate on higher levels. 2051 01:30:08,629 --> 01:30:11,170 JUSTIN CURRY: And [? do those ?] [? chunking ?] descriptions, 2052 01:30:11,170 --> 01:30:12,378 abstract [INAUDIBLE] details. 2053 01:30:12,378 --> 01:30:14,188 Try to pull out salient features. 2054 01:30:16,940 --> 01:30:19,920 CURRAN KELLEHER: So I remember what I was going to say before. 2055 01:30:19,920 --> 01:30:23,180 So this is an agent based model, where each one of these 2056 01:30:23,180 --> 01:30:24,830 is considered an agent. 2057 01:30:24,830 --> 01:30:26,690 And this emergent behavior happens 2058 01:30:26,690 --> 01:30:31,760 where the structure comes together and just moves around. 2059 01:30:31,760 --> 01:30:34,820 I remember I was going to say, if you think of your brain 2060 01:30:34,820 --> 01:30:38,956 as an agent based complex system, in a sense-- which 2061 01:30:38,956 --> 01:30:39,830 is really what it is. 2062 01:30:39,830 --> 01:30:44,030 Neurons are just interacting with each other and outside 2063 01:30:44,030 --> 01:30:45,800 stimulus. 2064 01:30:45,800 --> 01:30:48,590 So roughly speaking, it could be considered 2065 01:30:48,590 --> 01:30:53,000 this agent based system. 2066 01:30:53,000 --> 01:30:55,670 And the neurons are analogous to the balls here. 2067 01:30:55,670 --> 01:30:57,950 And the formations and the actions 2068 01:30:57,950 --> 01:31:01,910 of what's going on at a higher level, that are emergent, 2069 01:31:01,910 --> 01:31:05,090 is analogous to your thoughts. 2070 01:31:05,090 --> 01:31:07,820 So when your thoughts try to introspect on-- 2071 01:31:07,820 --> 01:31:08,689 well, I don't know. 2072 01:31:08,689 --> 01:31:09,980 I can't really go much further. 2073 01:31:09,980 --> 01:31:11,600 AUDIENCE: Sort of like a [INAUDIBLE] 2074 01:31:11,600 --> 01:31:15,252 you can't get back-- you can't get to the [? other island? ?] 2075 01:31:15,252 --> 01:31:17,460 CURRAN KELLEHER: So I mean, you can't understand each 2076 01:31:17,460 --> 01:31:19,290 of your neurons because you-- 2077 01:31:22,100 --> 01:31:23,120 I don't know. 2078 01:31:23,120 --> 01:31:26,490 This structure can't introspect on what 2079 01:31:26,490 --> 01:31:31,582 it's made out of, because then it wouldn't be itself anymore. 2080 01:31:31,582 --> 01:31:34,050 AUDIENCE: Oh, quantum mechanics, again. 2081 01:31:34,050 --> 01:31:36,632 CURRAN KELLEHER: Yeah, quantum mechanics again. 2082 01:31:36,632 --> 01:31:37,560 I don't know. 2083 01:31:40,810 --> 01:31:48,370 So one more example, which is pretty cool, which exhibits 2084 01:31:48,370 --> 01:31:50,970 optimization. 2085 01:31:50,970 --> 01:31:55,000 I can click and add balls, and they connect to each other. 
2086 01:31:55,000 --> 01:31:57,490 And each one of them has this-- 2087 01:31:57,490 --> 01:32:00,940 each pair, every pair, has this sort of optimal distance 2088 01:32:00,940 --> 01:32:03,050 away from one another. 2089 01:32:03,050 --> 01:32:06,640 And it sort of evolves to this optimal distance. 2090 01:32:06,640 --> 01:32:09,930 So nowhere in my program do I say, if you have-- 2091 01:32:09,930 --> 01:32:12,220 three of these form a triangle. 2092 01:32:12,220 --> 01:32:16,510 It's an emergent property due to these various local forces 2093 01:32:16,510 --> 01:32:19,870 trying to achieve local optimal solutions. 2094 01:32:19,870 --> 01:32:22,450 And this happens in chemistry a lot. 2095 01:32:22,450 --> 01:32:24,790 Particles and molecules always trying 2096 01:32:24,790 --> 01:32:28,000 to find the lowest energy state. 2097 01:32:28,000 --> 01:32:32,380 And that's how molecules exist. 2098 01:32:32,380 --> 01:32:33,861 But it's not that simple. 2099 01:32:33,861 --> 01:32:35,110 I mean, so here's another one. 2100 01:32:35,110 --> 01:32:38,410 At four, this is the optimal configuration. 2101 01:32:38,410 --> 01:32:40,190 And it just evolves into it. 2102 01:32:40,190 --> 01:32:41,840 That's how physics works. 2103 01:32:41,840 --> 01:32:45,760 And with five, it makes this star shape. 2104 01:32:45,760 --> 01:32:49,750 But this program is not as simple as I said it is. 2105 01:32:49,750 --> 01:32:51,730 There are some strange rules that 2106 01:32:51,730 --> 01:32:56,170 say when and when not to make edges between them. 2107 01:32:56,170 --> 01:32:59,680 Edges are connections between the balls. 2108 01:32:59,680 --> 01:33:01,940 So I can do something like this. 2109 01:33:01,940 --> 01:33:04,130 I can just drag and make a ton of things. 2110 01:33:07,582 --> 01:33:09,040 I don't remember the rules exactly, 2111 01:33:09,040 --> 01:33:11,540 but I tried to make it so that it was only local. 2112 01:33:11,540 --> 01:33:13,420 And things that are far apart don't 2113 01:33:13,420 --> 01:33:15,160 interact with one another. 2114 01:33:15,160 --> 01:33:16,715 They only go through intermediaries. 2115 01:33:16,715 --> 01:33:18,340 JUSTIN CURRY: Unless it's in resonance, 2116 01:33:18,340 --> 01:33:20,686 like it's forming [? breaking ?] [? laws. ?] 2117 01:33:20,686 --> 01:33:22,810 CURRAN KELLEHER: Yeah, it's like resonant, sort of. 2118 01:33:22,810 --> 01:33:25,080 Resonance. 2119 01:33:25,080 --> 01:33:27,630 I don't remember exactly what resonance is, but it's-- 2120 01:33:27,630 --> 01:33:28,453 JUSTIN CURRY: Resonance is just when 2121 01:33:28,453 --> 01:33:30,661 you're oscillating between kind of two superimposed-- 2122 01:33:30,661 --> 01:33:32,955 two possible states. 2123 01:33:32,955 --> 01:33:35,837 And, I mean, really, that's what's going on here 2124 01:33:35,837 --> 01:33:38,420 is that you have these different possible stable arrangements. 2125 01:33:38,420 --> 01:33:42,132 And there's this equilibrium that 2126 01:33:42,132 --> 01:33:45,572 is oscillating back and forth, but doing-- 2127 01:33:45,572 --> 01:33:46,530 CURRAN KELLEHER: Right. 2128 01:33:46,530 --> 01:33:49,140 So there are these two stable arrangements 2129 01:33:49,140 --> 01:33:50,370 that bleed into one another. 2130 01:33:50,370 --> 01:33:52,800 So it just oscillates between them. 2131 01:33:52,800 --> 01:33:57,720 I mean, it's sort of roughly molecules in the world, 2132 01:33:57,720 --> 01:33:58,696 and stuff.
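The rule Curran is describing lends itself to a short sketch: every connected pair of balls nudges itself toward a preferred separation, and with three balls an equilateral triangle simply appears. The rest length, stiffness, damping, and step size below are invented values for illustration, presumably not the applet's actual rules.

```python
REST = 1.0    # preferred separation for a connected pair (assumed value)
K = 0.5       # stiffness of the pull/push toward REST (assumed)
DAMP = 0.8    # velocity damping so the system settles instead of ringing

balls = [[0.0, 0.0], [2.0, 0.1], [0.3, 1.7]]   # arbitrary starting spots
vels  = [[0.0, 0.0], [0.0, 0.0], [0.0, 0.0]]
edges = [(0, 1), (1, 2), (0, 2)]               # every pair connected

def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

for _ in range(500):
    for i, j in edges:
        d = max(dist(balls[i], balls[j]), 1e-9)
        stretch = d - REST                     # >0: too far apart; <0: too close
        ux = (balls[j][0] - balls[i][0]) / d   # unit vector from ball i to ball j
        uy = (balls[j][1] - balls[i][1]) / d
        vels[i][0] += K * stretch * ux * 0.05  # each pair nudges itself toward REST
        vels[i][1] += K * stretch * uy * 0.05
        vels[j][0] -= K * stretch * ux * 0.05
        vels[j][1] -= K * stretch * uy * 0.05
    for b, v in zip(balls, vels):
        v[0] *= DAMP; v[1] *= DAMP
        b[0] += v[0]; b[1] += v[1]

# all three pair distances settle near REST: an equilateral triangle,
# even though no line of the program mentions triangles
print([round(dist(balls[i], balls[j]), 3) for i, j in edges])
```

That is the sense of "emergent" at work here: the triangle is nowhere in the rules, only in what the rules settle into.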
2133 01:34:01,650 --> 01:34:06,260 If I can make it all the way around, I can make a cell-- 2134 01:34:06,260 --> 01:34:07,560 I've done this before-- 2135 01:34:07,560 --> 01:34:12,390 where you have this sort of membrane which forms. 2136 01:34:12,390 --> 01:34:16,244 Which is exactly like what happens in real biology. 2137 01:34:16,244 --> 01:34:17,160 JUSTIN CURRY: Oh, yes. 2138 01:34:17,160 --> 01:34:21,070 CURRAN KELLEHER: There it is, more or less. 2139 01:34:21,070 --> 01:34:24,290 So in biology, you have these hydrophilic and hydrophobic 2140 01:34:24,290 --> 01:34:25,020 kinds of lipids. 2141 01:34:25,020 --> 01:34:27,450 And the hydrophobic ones go on inside, 2142 01:34:27,450 --> 01:34:29,010 and they all pair up with one another. 2143 01:34:29,010 --> 01:34:32,550 And the hydrophilic-- wait-- 2144 01:34:32,550 --> 01:34:34,170 hydrophilic ones go on the outside. 2145 01:34:34,170 --> 01:34:36,210 Hydrophilic can interact with water. 2146 01:34:36,210 --> 01:34:37,750 And they're soluble in water. 2147 01:34:37,750 --> 01:34:40,830 The hydrophobic ones, they have a fear of water. 2148 01:34:40,830 --> 01:34:43,510 So they try to go to one another. 2149 01:34:43,510 --> 01:34:46,230 And so you get this structure. 2150 01:34:46,230 --> 01:34:48,930 I mean, it's not exactly analogous to this, 2151 01:34:48,930 --> 01:34:51,560 but it's sort of close. 2152 01:34:51,560 --> 01:34:57,060 And it just assumes this optimal configuration in our cells. 2153 01:34:57,060 --> 01:34:59,880 If it weren't for this emergent-- this fundamentally 2154 01:34:59,880 --> 01:35:03,300 low level, lower than the level of cells. 2155 01:35:03,300 --> 01:35:05,130 This is membranes. 2156 01:35:05,130 --> 01:35:08,940 This-- without this emergent property of lipids, 2157 01:35:08,940 --> 01:35:10,177 we wouldn't exist. 2158 01:35:10,177 --> 01:35:11,760 It's a fundamental thing that holds us 2159 01:35:11,760 --> 01:35:13,200 together all the time-- 2160 01:35:13,200 --> 01:35:14,860 our cells. 2161 01:35:14,860 --> 01:35:17,550 I mean, us as humans and all biology 2162 01:35:17,550 --> 01:35:19,590 is just built up of these layers and layers 2163 01:35:19,590 --> 01:35:21,430 of emergent properties. 2164 01:35:21,430 --> 01:35:22,800 So I mean, layers-- 2165 01:35:22,800 --> 01:35:27,090 so these are cells and-- no, no, these are lipids. 2166 01:35:27,090 --> 01:35:31,890 And one level above that is these stable, spherical 2167 01:35:31,890 --> 01:35:32,460 membranes. 2168 01:35:32,460 --> 01:35:35,010 And then a level above that is the interaction of cells 2169 01:35:35,010 --> 01:35:37,380 with one another to form organs. 2170 01:35:37,380 --> 01:35:40,650 And then a level above that is the interactions of organs 2171 01:35:40,650 --> 01:35:43,470 with one another in the bloodstream. 2172 01:35:43,470 --> 01:35:47,850 And a level above that is the brain interacting 2173 01:35:47,850 --> 01:35:51,640 with that in a positive way. 2174 01:35:51,640 --> 01:35:55,110 So we just built up all these crazy layers everywhere. 2175 01:35:55,110 --> 01:35:58,600 Everywhere in biology, you'll find this sort of thing. 2176 01:35:58,600 --> 01:36:00,170 So it's emergence. 2177 01:36:00,170 --> 01:36:06,400 It's really cool to grok emergence. 2178 01:36:06,400 --> 01:36:07,890 So any thoughts from anybody? 2179 01:36:07,890 --> 01:36:09,480 Questions? 2180 01:36:09,480 --> 01:36:11,038 We're just about out of time.
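A toy, strictly one-dimensional version of the lipid story, in case it helps: score an arrangement by how many hydrophobic tails touch water, and the membrane-like arrangement wins. The scoring rule, the 1-D layout, and both configurations are invented for illustration; real lipids live in three dimensions and their energetics are far richer.

```python
# 'W' is water, and each lipid is a hydrophilic head 'H' glued to a
# hydrophobic tail 'T'. Energy: +1 for every tail touching water.
def energy(row):
    e = 0
    for i, c in enumerate(row):
        if c == 'T':
            for j in (i - 1, i + 1):
                if 0 <= j < len(row) and row[j] == 'W':
                    e += 1
    return e

scattered = "WHTWWTHWWHTWWTHW"   # lipids strewn about: tails exposed to water
bilayer   = "WWWWHTTHHTTHWWWW"   # tails tucked inside, heads facing the water

print("scattered:", energy(scattered))  # 4 -- higher energy
print("bilayer:  ", energy(bilayer))    # 0 -- the membrane arrangement wins
```

The lowest-energy arrangement is exactly the head-tail-tail-head sandwich of a bilayer, which is the "optimal configuration" the lecture attributes to real cell membranes.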
2181 01:36:14,032 --> 01:36:18,380 AUDIENCE: Why is it that almost every single little thing can 2182 01:36:18,380 --> 01:36:23,990 be put down to the smallest, basic, physical processes? 2183 01:36:23,990 --> 01:36:25,460 The only problem, I think, we have 2184 01:36:25,460 --> 01:36:30,800 is to develop a theory of organization, 2185 01:36:30,800 --> 01:36:35,260 like invariance of properties. 2186 01:36:35,260 --> 01:36:40,220 Like, say, even though almost like 1,000 of my brain 2187 01:36:40,220 --> 01:36:43,050 cells have been destroyed in maybe the past day 2188 01:36:43,050 --> 01:36:44,000 or something. 2189 01:36:44,000 --> 01:36:46,640 I've been drinking or something. 2190 01:36:46,640 --> 01:36:49,250 I'm still me. 2191 01:36:49,250 --> 01:36:51,110 What about my brain organization still 2192 01:36:51,110 --> 01:36:53,550 says, OK, this is still Atif? 2193 01:36:53,550 --> 01:36:57,360 He's still the same thing. 2194 01:36:57,360 --> 01:36:59,940 JUSTIN CURRY: I mean, robustness of systems, right? 2195 01:36:59,940 --> 01:37:01,550 CURRAN KELLEHER: Yeah, that's totally 2196 01:37:01,550 --> 01:37:03,110 what you're hinting at-- robustness. 2197 01:37:03,110 --> 01:37:06,200 Robustness means you can change parts 2198 01:37:06,200 --> 01:37:08,540 of it, like for example with a wireless mesh 2199 01:37:08,540 --> 01:37:09,850 network or something. 2200 01:37:09,850 --> 01:37:12,410 It's robust because if you knock out 2201 01:37:12,410 --> 01:37:16,260 a bunch of nodes here and there, it still exists as a network. 2202 01:37:16,260 --> 01:37:18,800 And so what you're saying in your brain, 2203 01:37:18,800 --> 01:37:21,260 even if you get completely wasted, 2204 01:37:21,260 --> 01:37:23,240 you still know that you're yourself. 2205 01:37:23,240 --> 01:37:24,680 So it's robust to these changes. 2206 01:37:24,680 --> 01:37:25,886 AUDIENCE: [INAUDIBLE]. 2207 01:37:25,886 --> 01:37:29,160 Suppose, OK, some brain cells are destroyed. 2208 01:37:29,160 --> 01:37:30,520 Are the new ones-- 2209 01:37:30,520 --> 01:37:33,980 do I [INAUDIBLE] it up to make sure my self-concept is 2210 01:37:33,980 --> 01:37:35,060 to preserve that. 2211 01:37:35,060 --> 01:37:37,640 Between those brain cells being destroyed 2212 01:37:37,640 --> 01:37:42,520 and neurons being made, created, where am I? 2213 01:37:42,520 --> 01:37:45,080 CURRAN KELLEHER: Well, OK, so you're saying-- 2214 01:37:45,080 --> 01:37:49,220 say, some brain cells get destroyed. 2215 01:37:49,220 --> 01:37:56,210 And your concept of yourself is temporarily dissolved. 2216 01:37:56,210 --> 01:37:59,270 And then new pathways or whatnot are formed, 2217 01:37:59,270 --> 01:38:01,135 that get your sense of self back. 2218 01:38:01,135 --> 01:38:03,985 AUDIENCE: So at some times, I don't even exist anymore. 2219 01:38:03,985 --> 01:38:05,360 CURRAN KELLEHER: This is, I mean, 2220 01:38:05,360 --> 01:38:09,590 hinting at a very important point, that what are you? 2221 01:38:12,990 --> 01:38:14,450 You are an emergent property. 2222 01:38:14,450 --> 01:38:18,200 Your sense of self is not the actual neurons, 2223 01:38:18,200 --> 01:38:20,860 it's this thing above, which exists on a higher level. 2224 01:38:20,860 --> 01:38:22,610 AUDIENCE: Yeah, but it has to be supported 2225 01:38:22,610 --> 01:38:24,330 by the neurons in there, you know? 2226 01:38:24,330 --> 01:38:26,884 You have to create that structure of it. 2227 01:38:26,884 --> 01:38:27,800 CURRAN KELLEHER: Yeah. 2228 01:38:27,800 --> 01:38:29,780 It's a supporting structure.
2229 01:38:29,780 --> 01:38:35,330 But, yeah, you're saying-- so if you temporarily dissolve, then 2230 01:38:35,330 --> 01:38:36,280 you just don't exist. 2231 01:38:36,280 --> 01:38:38,363 I mean, I don't think there's anything more to it. 2232 01:38:38,363 --> 01:38:39,590 And you just reappear. 2233 01:38:39,590 --> 01:38:41,810 You just disappear, reappear, and things do that 2234 01:38:41,810 --> 01:38:44,120 all the time. 2235 01:38:44,120 --> 01:38:46,220 AUDIENCE: That could be the case if it was data. 2236 01:38:48,750 --> 01:38:51,450 I could be knocked out, and then I could be rebuilt 2237 01:38:51,450 --> 01:38:53,221 and, OK, I'll be the same. 2238 01:38:53,221 --> 01:38:54,220 But is that [INAUDIBLE]? 2239 01:38:54,220 --> 01:38:57,050 Do I have to have continuity to be really there? 2240 01:38:59,432 --> 01:39:01,390 Even like-- you can't even have much continuity 2241 01:39:01,390 --> 01:39:05,010 because [INAUDIBLE] neurons like [INAUDIBLE].. 2242 01:39:05,010 --> 01:39:09,230 Given the fact that a second continuous thought, 2243 01:39:09,230 --> 01:39:14,686 maybe that's even an illusion that signals up the trouble. 2244 01:39:14,686 --> 01:39:19,482 JUSTIN CURRY: Right, so fundamentally you have-- 2245 01:39:19,482 --> 01:39:21,530 the cells in your body are replaced 2246 01:39:21,530 --> 01:39:23,340 about every seven years. 2247 01:39:23,340 --> 01:39:26,910 So you, seven years ago, are not made of the same cells 2248 01:39:26,910 --> 01:39:28,880 you're made of now, right? 2249 01:39:28,880 --> 01:39:33,650 So there's just this idea that what 2250 01:39:33,650 --> 01:39:35,810 constitutes you has to be stored at a higher 2251 01:39:35,810 --> 01:39:37,970 level of description. 2252 01:39:37,970 --> 01:39:42,650 That you can take entire chunks out at a time 2253 01:39:42,650 --> 01:39:44,180 and still have things OK. 2254 01:39:44,180 --> 01:39:47,150 And what makes us different than a computer, 2255 01:39:47,150 --> 01:39:49,990 is that we can be running us-- 2256 01:39:49,990 --> 01:39:52,250 which is like, we would like to run a computer 2257 01:39:52,250 --> 01:39:54,170 and be able to, I don't know, swap out 2258 01:39:54,170 --> 01:40:00,140 a few transistors and maybe even like a stick of memory, 2259 01:40:00,140 --> 01:40:03,289 and still have everything running smoothly up top. 2260 01:40:03,289 --> 01:40:05,330 Because that's what happens all the time with us. 2261 01:40:05,330 --> 01:40:07,250 AUDIENCE: But given the fact that you can also 2262 01:40:07,250 --> 01:40:10,356 kick out your memory, you're not going to be you. 2263 01:40:10,356 --> 01:40:11,730 JUSTIN CURRY: Yeah, so then there 2264 01:40:11,730 --> 01:40:14,010 does seem to be certain points of view, 2265 01:40:14,010 --> 01:40:15,860 which are really critical. 2266 01:40:15,860 --> 01:40:19,640 But there are certain key nodes, that if you knock those out-- 2267 01:40:19,640 --> 01:40:22,306 and this is really almost a problem in graph theory. 2268 01:40:22,306 --> 01:40:23,930 And this is really something that we've 2269 01:40:23,930 --> 01:40:27,080 used in understanding terrorist networks, 2270 01:40:27,080 --> 01:40:29,360 is like, sure, you can knock out a few lower guys, 2271 01:40:29,360 --> 01:40:31,550 but you're not going to destroy the whole network. 2272 01:40:31,550 --> 01:40:34,190 But if you knock out that guy, is it-- the whole thing-- 2273 01:40:34,190 --> 01:40:35,420 going to come to the ground. 
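The "key node" point really is a graph problem, as Justin says, and a dozen lines show it: random failures usually leave a network connected, but removing a cut vertex shatters it. The little two-cluster network and its bridge node below are made up for illustration; the component count is plain breadth-first search.

```python
from collections import deque

# a mesh-ish cluster joined to a second cluster through one bridge node, 4
graph = {
    0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 4}, 3: {0, 1, 4},
    4: {2, 3, 5, 6},                      # the key node bridging both sides
    5: {4, 6, 7}, 6: {4, 5, 7}, 7: {5, 6},
}

def components(graph, dead=()):
    """Count connected components after deleting the nodes in `dead`."""
    alive = set(graph) - set(dead)
    seen, count = set(), 0
    for start in alive:
        if start in seen:
            continue
        count += 1
        queue = deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v in alive and v not in seen:
                    seen.add(v)
                    queue.append(v)
    return count

print(components(graph))            # 1: everything reachable
print(components(graph, dead=[1]))  # still 1: robust to losing a "lower guy"
print(components(graph, dead=[4]))  # 2: knock out the key node and it splits
```

A leg injury is like deleting node 1; a pole to the head is like deleting node 4.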
2274 01:40:35,420 --> 01:40:40,250 It's just like if you get a pole to the head or something, 2275 01:40:40,250 --> 01:40:42,910 suddenly you could be done. 2276 01:40:42,910 --> 01:40:45,950 Or if someone cracks you in the leg, you're going to be OK. 2277 01:40:49,310 --> 01:40:51,390 It's almost like a graph problem, fundamentally. 2278 01:40:51,390 --> 01:40:54,056 CURRAN KELLEHER: So a lot of the things that we're talking about 2279 01:40:54,056 --> 01:40:56,200 are problems of complex systems. 2280 01:40:56,200 --> 01:40:59,405 And you hinted at before, you said, well, maybe what 2281 01:40:59,405 --> 01:41:03,880 we should do is try to come up with a general understanding 2282 01:41:03,880 --> 01:41:08,154 of these kinds of systems and theories about them. 2283 01:41:08,154 --> 01:41:09,320 And this is complex systems. 2284 01:41:09,320 --> 01:41:14,230 So this is what New England Complex Systems Institute does, 2285 01:41:14,230 --> 01:41:17,109 or Santa Fe Institute of California. 2286 01:41:17,109 --> 01:41:18,150 JUSTIN CURRY: New Mexico. 2287 01:41:18,150 --> 01:41:20,340 CURRAN KELLEHER: Oh, New Mexico. 2288 01:41:20,340 --> 01:41:22,337 JUSTIN CURRY: Sorry. 2289 01:41:22,337 --> 01:41:24,670 Well, yeah, there's a lot of interesting stuff going on. 2290 01:41:24,670 --> 01:41:28,740 And whatever interests you, I hope we can plug that interest. 2291 01:41:28,740 --> 01:41:31,770 And please talk with us after class, 2292 01:41:31,770 --> 01:41:36,960 and we can point you in the right direction. 2293 01:41:36,960 --> 01:41:38,650 And we're completely open-- 2294 01:41:38,650 --> 01:41:40,754 we hope that's open-- ah. 2295 01:41:40,754 --> 01:41:41,670 You're attached to me. 2296 01:41:41,670 --> 01:41:43,490 CURRAN KELLEHER: I am. 2297 01:41:43,490 --> 01:41:46,490 JUSTIN CURRY: No coulombic forces here. 2298 01:41:46,490 --> 01:41:49,689 But yes, please feel free to come talk to us after class 2299 01:41:49,689 --> 01:41:52,230 and we can try to suit you up with your particular interests. 2300 01:41:52,230 --> 01:41:54,480 And even if you don't know what you're interested in-- 2301 01:41:54,480 --> 01:41:57,000 or you have an idea of, well, that was really boring, 2302 01:41:57,000 --> 01:41:58,890 but this was OK-- 2303 01:41:58,890 --> 01:42:04,380 we can maybe cater to your interests. 2304 01:42:04,380 --> 01:42:06,550 But this was really kind of the last lecture. 2305 01:42:06,550 --> 01:42:08,800 And I wanted to thank all of you guys for coming here, 2306 01:42:08,800 --> 01:42:11,790 for bringing your questions, for bringing your open minds. 2307 01:42:11,790 --> 01:42:15,510 And I think we've had a really good semester so far. 2308 01:42:15,510 --> 01:42:20,100 And I hope you guys enjoy Waking Life in the next lecture. 2309 01:42:20,100 --> 01:42:22,650 But other than that, any last questions 2310 01:42:22,650 --> 01:42:24,780 before I say goodbye and bon voyage? 2311 01:42:27,391 --> 01:42:27,890 Atif. 2312 01:42:27,890 --> 01:42:30,292 AUDIENCE: I've got a paradox that I've been struggling 2313 01:42:30,292 --> 01:42:32,880 with for the whole day. 2314 01:42:32,880 --> 01:42:35,360 So you have zero to one, right? 2315 01:42:35,360 --> 01:42:38,040 And you have zero to two, right? 2316 01:42:38,040 --> 01:42:40,915 Both of them have an infinite amount of elements, 2317 01:42:40,915 --> 01:42:45,990 but it can be proven that there are as many points from 0 to 1 as from 0 to 2.
2318 01:42:45,990 --> 01:42:48,450 So why is two greater than one if there's 2319 01:42:48,450 --> 01:42:51,700 as much stuff between both? 2320 01:42:51,700 --> 01:42:53,740 JUSTIN CURRY: Because we don't use, 2321 01:42:53,740 --> 01:42:58,270 as a well ordering operation, the number of points 2322 01:42:58,270 --> 01:43:00,250 between this and that. 2323 01:43:00,250 --> 01:43:04,370 We just start off with here are the integers. 2324 01:43:04,370 --> 01:43:06,220 Here's a well ordering on them. 2325 01:43:06,220 --> 01:43:08,892 And that's how we go. 2326 01:43:08,892 --> 01:43:10,475 AUDIENCE: So you just put distraction. 2327 01:43:10,475 --> 01:43:13,440 It doesn't matter how much [INAUDIBLE].. 2328 01:43:13,440 --> 01:43:15,874 JUSTIN CURRY: Right, exactly. 2329 01:43:15,874 --> 01:43:17,290 I mean, it's an uncountable amount 2330 01:43:17,290 --> 01:43:18,539 of stuff in the unit interval. 2331 01:43:18,539 --> 01:43:20,620 And there's also an uncountable-- the same 2332 01:43:20,620 --> 01:43:24,107 uncountable amount of stuff in the entire real line. 2333 01:43:24,107 --> 01:43:25,690 These are the paradoxes of infinities, 2334 01:43:25,690 --> 01:43:27,460 but they're not really paradoxes. 2335 01:43:27,460 --> 01:43:29,626 Because we're not used to thinking about infinities. 2336 01:43:29,626 --> 01:43:31,419 AUDIENCE: Maybe it's just for convenience. 2337 01:43:31,419 --> 01:43:33,460 JUSTIN CURRY: So maybe it's just for convenience? 2338 01:43:33,460 --> 01:43:35,230 Well, I mean, partly. 2339 01:43:35,230 --> 01:43:37,240 But the other thing is mathematicians 2340 01:43:37,240 --> 01:43:40,870 are pretty retentive creatures. 2341 01:43:40,870 --> 01:43:44,495 And we make these not just for convention, 2342 01:43:44,495 --> 01:43:46,120 but we hope they're rigorously defined. 2343 01:43:46,120 --> 01:43:50,170 And you can actually create your real numbers just out 2344 01:43:50,170 --> 01:43:55,510 of rational numbers by using this process of completion. 2345 01:43:55,510 --> 01:43:58,810 And there, I mean, you have to go do an undergraduate level 2346 01:43:58,810 --> 01:44:00,130 course in analysis, really. 2347 01:44:03,655 --> 01:44:09,032 AUDIENCE: [INAUDIBLE] do a square on one 2348 01:44:09,032 --> 01:44:11,300 of them [INAUDIBLE],, square on another. 2349 01:44:11,300 --> 01:44:14,860 And so this has a bigger [INAUDIBLE] than the other one. 2350 01:44:14,860 --> 01:44:16,360 JUSTIN CURRY: Yeah, but you're still 2351 01:44:16,360 --> 01:44:18,220 kind of getting at the idea. 2352 01:44:18,220 --> 01:44:23,830 Because just as you said, there are just as many points in R2, 2353 01:44:23,830 --> 01:44:24,560 right? 2354 01:44:24,560 --> 01:44:27,010 The two dimensional plane that started on the line. 2355 01:44:27,010 --> 01:44:30,340 You could have space filling curves that visit everything, 2356 01:44:30,340 --> 01:44:32,740 which can be put into one [INAUDIBLE] correspondence 2357 01:44:32,740 --> 01:44:34,570 with just the real line. 2358 01:44:34,570 --> 01:44:37,060 It's the same cardinality. 2359 01:44:37,060 --> 01:44:39,720 AUDIENCE: So then why is two greater than one? 2360 01:44:39,720 --> 01:44:41,470 JUSTIN CURRY: Why is two greater than one? 2361 01:44:41,470 --> 01:44:44,070 It's because we started out that way, basically. 2362 01:44:44,070 --> 01:44:47,427 AUDIENCE: But then we're getting something that's conflicting. 2363 01:44:47,427 --> 01:44:49,010 JUSTIN CURRY: No it's not conflicting.
2364 01:44:49,010 --> 01:44:52,289 It's just-- you can still-- 2365 01:44:52,289 --> 01:44:54,080 if you give me a number and another number, 2366 01:44:54,080 --> 01:44:56,320 I can tell you which one is bigger than the other. 2367 01:44:56,320 --> 01:44:57,194 That's not a problem. 2368 01:44:59,912 --> 01:45:01,370 AUDIENCE: How do you define bigger? 2369 01:45:01,370 --> 01:45:04,384 The amount of stuff that's in it, or like the-- 2370 01:45:04,384 --> 01:45:05,800 JUSTIN CURRY: What does that mean? 2371 01:45:05,800 --> 01:45:07,930 What does the amount of stuff in it mean? 2372 01:45:07,930 --> 01:45:10,950 AUDIENCE: OK, like there's as many points between zero 2373 01:45:10,950 --> 01:45:13,729 and one and zero and two. 2374 01:45:13,729 --> 01:45:15,770 JUSTIN CURRY: All right, well, see that's not a-- 2375 01:45:15,770 --> 01:45:16,270 yes. 2376 01:45:16,270 --> 01:45:18,520 I mean that's a true statement, but it's also true 2377 01:45:18,520 --> 01:45:21,160 that there are as many odd numbers 2378 01:45:21,160 --> 01:45:22,810 as there are natural numbers. 2379 01:45:22,810 --> 01:45:25,918 There's as many even numbers as there are integers. 2380 01:45:25,918 --> 01:45:27,918 CURRAN KELLEHER: But it's not a definite number. 2381 01:45:27,918 --> 01:45:29,147 It's an infinity. 2382 01:45:29,147 --> 01:45:29,980 JUSTIN CURRY: Right. 2383 01:45:29,980 --> 01:45:31,771 Just as you've said, it can be put into one 2384 01:45:31,771 --> 01:45:33,780 to one correspondence with a subset of itself. 2385 01:45:33,780 --> 01:45:37,760 CURRAN KELLEHER: And like that, [? a picture. ?] 2386 01:45:37,760 --> 01:45:39,960 AUDIENCE: But then, I just don't know 2387 01:45:39,960 --> 01:45:43,610 how you can get from the things in [INAUDIBLE] then you saying, 2388 01:45:43,610 --> 01:45:47,959 OK, this is two greater than that one. 2389 01:45:47,959 --> 01:45:49,000 JUSTIN CURRY: Well, yeah. 2390 01:45:49,000 --> 01:45:50,540 You just don't use that as your metric 2391 01:45:50,540 --> 01:45:52,498 for counting number of points, because it's not 2392 01:45:52,498 --> 01:45:53,810 a well defined idea. 2393 01:45:53,810 --> 01:45:56,920 I mean, it's well defined, but it doesn't give you any info. 2394 01:45:56,920 --> 01:45:58,700 What it would tell you is that the reals 2395 01:45:58,700 --> 01:46:01,820 are bigger than the integers. 2396 01:46:01,820 --> 01:46:03,480 But that's not-- you want to know 2397 01:46:03,480 --> 01:46:04,646 that two is bigger than one. 2398 01:46:09,400 --> 01:46:11,242 AUDIENCE: So you just want this number one 2399 01:46:11,242 --> 01:46:16,707 to pull a metric that's almost the same as the integers on it? 2400 01:46:16,707 --> 01:46:18,290 JUSTIN CURRY: I'm not sure, but if you 2401 01:46:18,290 --> 01:46:20,123 want to know more about metrics, and things, 2402 01:46:20,123 --> 01:46:22,307 we should talk about it after class. 2403 01:46:22,307 --> 01:46:24,390 Just real quickly, can I field any other questions 2404 01:46:24,390 --> 01:46:27,660 that people have burning deep down inside of them? 2405 01:46:30,930 --> 01:46:32,390 All right, well then-- 2406 01:46:32,390 --> 01:46:33,310 yes. 2407 01:46:33,310 --> 01:46:36,640 Let's call it a wrap and then feel free to stick around 2408 01:46:36,640 --> 01:46:37,690 and hang out after class.
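For anyone who wants Justin's answer to Atif's paradox in symbols, the standard fact is this (a sketch, stating nothing beyond what's said above):

```latex
\[
f : [0,1] \to [0,2], \qquad f(x) = 2x
\]
$f$ is one-to-one and onto, so the two intervals have exactly the same
cardinality, $2^{\aleph_0}$. The sense in which $2 > 1$ is different: it comes
from the order on $\mathbb{R}$, or equivalently from the lengths of the
intervals, $\mu([0,2]) = 2 > 1 = \mu([0,1])$, not from counting points. The
same trick, $n \mapsto 2n$, pairs the natural numbers with the even numbers:
an infinite set can be put in one-to-one correspondence with a proper subset
of itself, which is the point made in the exchange above.
```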
2409 01:46:37,690 --> 01:46:39,939 AUDIENCE: If [INAUDIBLE] feel like explaining the self 2410 01:46:39,939 --> 01:46:42,688 as almost this data thing, how are you 2411 01:46:42,688 --> 01:46:45,716 going to explain the observer of that data? 2412 01:46:45,716 --> 01:46:51,030 CURRAN KELLEHER: Yeah, it's a tangled hierarchy. 2413 01:46:51,030 --> 01:46:53,400 AUDIENCE: And even then, you have 2414 01:46:53,400 --> 01:46:55,820 the observer, where the observer just observed being also 2415 01:46:55,820 --> 01:46:57,381 [INAUDIBLE]. 2416 01:46:57,381 --> 01:46:59,672 CURRAN KELLEHER: They're not different from each other. 2417 01:46:59,672 --> 01:47:01,302 AUDIENCE: They're different or they're the same? 2418 01:47:01,302 --> 01:47:01,768 CURRAN KELLEHER: They're the same. 2419 01:47:01,768 --> 01:47:03,054 They're not different. 2420 01:47:03,054 --> 01:47:04,470 AUDIENCE: So then, OK, you can say 2421 01:47:04,470 --> 01:47:06,300 [? the dead ?] is observing itself, 2422 01:47:06,300 --> 01:47:08,581 was recording things about itself. 2423 01:47:08,581 --> 01:47:10,955 JUSTIN CURRY: Because, I mean, even in quantum mechanics, 2424 01:47:10,955 --> 01:47:12,740 we need an observer. 2425 01:47:12,740 --> 01:47:15,420 The observer just really can be a measurement 2426 01:47:15,420 --> 01:47:17,725 that just happens by the interaction of two particles. 2427 01:47:17,725 --> 01:47:21,366 So particles themselves can be observers themselves. 2428 01:47:21,366 --> 01:47:22,990 CURRAN KELLEHER: [INAUDIBLE] mechanics. 2429 01:47:22,990 --> 01:47:24,190 JUSTIN CURRY: Did I say that? 2430 01:47:24,190 --> 01:47:24,510 CURRAN KELLEHER: No. 2431 01:47:24,510 --> 01:47:25,426 JUSTIN CURRY: Quantum. 2432 01:47:25,426 --> 01:47:28,160 CURRAN KELLEHER: I was just thinking of quantum mechanics.