The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high-quality educational resources for free. To make a donation, or to view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

MICHEL DEGRAFF: I don't want to make anyone jealous who came before, but I think we can say pretty safely that we saved one of the best for last, right? I won't say the best. OK? All right.

So as you can see, throughout the course we looked at these issues about how language, race, ethnicity, gender, how they can be used to create various hierarchies. And as you can see now after the semester, these are tools, really. They're not inherently part of who we are, but they are tools that various pools of power use to keep control, to create hierarchies.

And I think last week you had a very good discussion with Dr. Aleman about how sometimes these hierarchies can be internalized. And there was a very good discussion, as I can tell from the video that she took for me, about how we, ourselves, need to do some work inside of us to go beyond the threats that these terror attacks impose on us. So we went very macro, then we went very micro. And today, with my friend and colleague, Noam Chomsky, we're going to go macro again.

But I guess you've read the papers. I read the papers. And I must say, Noam, I was very disturbed by what I read. But the fact is that those are the data.

So this is MIT. We are very much involved in trying to understand knowledge. And many of you throughout the semester made a case that we have all this knowledge here at MIT, but often this knowledge does not translate into action in the real world. And in a way, I think with Noam's entire life, we have an example of how that can happen: how we can take very abstract, technical knowledge and make it available to the world and try to make it better.
And the theme of this course throughout was how to build bridges, how to make change and build bridges. So we hope that with Noam today we'll get a clear sense of how that can happen. And perhaps how we might, with some luck, save the world, right?

NOAM CHOMSKY: I think the best way to proceed in a course seminar is for you to say what you're interested in, and I'll see if I have some way of reacting to it. There are lots of things I could talk about.

I mean, one thing we could talk about, related to this, is right in the headlines. So there are things here which are an opening for things we can do. That's the plan that was just made public yesterday to deport Haitians back to Haiti from Boston. That's a very live issue. A lot of lives depend on it. We can do something about it right here if we want.

It turns out-- I didn't know this-- that the Haitian Ambassador, Michel just told me, who is pleading with the government not to carry out this onerous and destructive act, is actually a former MIT student who wrote about these topics in his thesis. So that's one of many things that could be discussed.