The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high-quality educational resources for free. To make a donation, or to view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

MICHEL DEGRAFF: Talking about students at MIT. Some of you are graduating soon, and this might be your last chance to ask Noam Chomsky a question. So would anyone who hasn't spoken yet like to ask a question before you graduate?

AUDIENCE: We've talked in this class a bit about silencing and how that kind of leads into the whole system. Sorry, I can speak up. So we've talked about silencing.

NOAM CHOMSKY: About?

MICHEL DEGRAFF: Silencing.

NOAM CHOMSKY: Silencing.

AUDIENCE: Yeah. And how stories are hidden, like you talked about. What can we do about that?

NOAM CHOMSKY: About silencing?

AUDIENCE: What measurable steps can we take towards uncovering the stories and actually making change happen?

NOAM CHOMSKY: Well, what you do about silencing is speak up. We have a lot of freedom if we use it. In fact, a lot more than in the past. Take the last thing I mentioned. For women, the opportunity to speak up, become active, and have an impact is way beyond what it was, say, 50 years ago.

In fact, I was talking to Michel about this before. I got a ton of email, and a lot of it is from young people. And one question that keeps coming up over and over again is: everything's awful, what can we do? But looking closely over many years, I've noticed that the kinds of questions you get depend on where the people are coming from. So when I go down to give talks in immigrant slums in South Boston, nobody ever asks what they can do. They tell me what they're doing.
When I go to a remote village in Southern Colombia, where people are being murdered by paramilitaries and a gold mining company is trying to destroy the water supply, they don't ask me what they should do. They tell me what they're doing.

When people are privileged and have every opportunity in front of them, they ask, what can we do? And the fact is, almost anything. There are all sorts of opportunities open, way more than there were in the past, because we enjoy the legacy of people who struggled in the past under much harsher conditions and provided us with the opportunities we now have. Like, say, the opportunity to be a student at MIT. That's an opportunity. It gives you all sorts of things. It was closed to half the population 50 years ago. It was closed, of course, to minorities almost entirely. Well, it just illustrates what's gone on all over the society.

I mean, take, say, the Sanders campaign, which I think was the most important aspect of the 2016 election, by far. Out of that, there are groups developing that are really trying to do things. You can be part of that. Make new ones. All sorts of opportunities. And I think the prospects for the future are not too bad when you think of people's actual attitudes, and the way those attitudes could be melded into activist and political and other programs to bring about changes in society. So I don't think there's a real shortage of opportunity.

I mean, there are all sorts of efforts to atomize people, keep you alone. That's one of the functions of social media, incidentally. You sit alone with-- whatever it is-- your device, and you think you have friends in Indonesia and so on. But you're really separated from the people around you. Atomization is a very important way of controlling people, because you can only do things if you work together. So when you're all separated, that's great. Then those who really run things, they do get together. They don't sit around looking at iPads, and so on.
So you can keep the population-- In fact, that's one of the main things consumerism is about. Take a look at the history of the advertising industry. It's quite interesting. The huge advertising and public relations industry developed in the freest countries in the world, in Britain and the United States, about a century ago. And the reason was pretty clear; it was often stated. People had won enough freedom that you couldn't control them by force, so you had to control them in other ways. And the best way of controlling them was what was called convincing them to be concerned with the superficial things of life, like consumption.

So if you can get to a stage where-- I'm thinking of my granddaughter-- teenage girls on a Saturday afternoon, where the best thing they can think of doing is walking through a mall and looking at things they can't buy. That's great. When you get to a society like that, you've got people under control.

And there are some pretty obvious things that are never discussed. You've all heard 10 million times about the wonders of free market societies. But almost no one tells you that the business world hates free market society. In fact, they spend hundreds of billions of dollars a year to undermine markets. It's called advertising.

Anyone who has studied economics knows that the marvels of markets are based on informed consumers making rational choices. Then you study the mathematics, and you make up the models, and so on. Take a look at the real world. Hundreds of billions of dollars are spent every year to ensure that uninformed people make irrational choices. It's called advertising.

If you had a market society and, say, Ford Motor Company had cars to sell, what they would do is say, here are the characteristics of our cars. A small ad on television: here are the characteristics, here's what Consumer Reports says about it.
That's not what they do. What they do is try to create illusions, so that you will be uninformed and make an irrational choice. In fact, while the economics department is talking about rational choice models and so on, with informed consumers, the business world is trying to make sure it doesn't happen. And of course, that's the thing that has the effect.

Speaking of silencing, how much do you hear about this in economics courses? Or in the general discussion? It's not deep. It's not profound. It's not quantum physics. We all know it. It's right in front of our eyes, but it's kept silent. And those are the kinds of things you have to break through.

MICHEL DEGRAFF: Thank you so much, Noam. That was wonderful.

[applause]