The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation or view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

PROFESSOR: So I want to do two things today. One is wrap up what I was doing last time. And the rest of the time, or a part of the rest of the time, I want to spend going back to the questions of what have we learned, or what do we think we have learned?

So this is mostly continuation. So let me just remind you what I'm continuing. The basic point I was trying to make last time is that there are always many things you can do: even bad systems allow for a relatively wide range of possible interventions. And in reverse, even in good systems, if you have bad interventions, they're bad. The fact that the system is good is no guarantee that the interventions that get chosen are good.

If you like, in that sequence of arguments, this is the last one, which is: can you actually turn bad politics around? Is politics really prior to policy?
So does policy exist outside politics? And we sort of made the case that that's not entirely true already, because we've been saying things like: if people have some idea of who they should vote for, then they vote more discriminatingly. Part of the reason why they don't necessarily use a lot of discrimination is that they're not necessarily that interested in the election. And that's partly because they don't have much information.

Here's a nice experiment which emphasizes that point about how policies can mold politics. This has to do with attitudes toward women leaders in India. In India, panchayats are the lowest level of government; these are village governments, essentially. And India has a rule which says that one third of all seats for leader of the panchayat, the role of the boss of the panchayat, have to be filled by women. And it turns out that, partly for political reasons, they decided that which of them would be reserved for women would be decided by a lottery. So there was a lottery to decide, out of the 3,000 or however many thousand villages in this particular state, which of them would have a woman leader this time.
And then, in principle, the idea is that next time there will be a rotation. And so the places that are reserved this time will not be reserved the next time, and vice versa. That didn't actually happen entirely, and I'll tell you something about that in a minute. But right now think of it as a lottery, and then we see what happens.

So the advantage of it being a lottery is that the government in a sense performed an experiment for us, because they chose these places by a lottery. So we know that they're identical to start with. There was a bunch of places which had to have a woman leader. These places were not a priori any different from any other places. They ended up with a woman leader.

Now, in terms of outcomes, there's data that therefore allows you to compare the villages that had a woman leader with places that didn't. And one outcome seems clear: women leaders are not obviously worse than men leaders. They seem to spend about the same amount of money.
And when you ask villagers their view of corruption in government, they're less likely to say the government is corrupt. So overall, if you look at just the numbers, women seem to be just as effective as men.

Now, this is not obvious. Indeed, it's quite surprising, because the women are actually less experienced. Often this is the first time they've had a political job. For many of them, this is not just the first time they've had a political job; this is the first time they've ever had a job. They're less educated than the men, and they're less likely to have run a business than the men. That's not surprising. But all of those things are often seen as attributes of successful leaders. So it's maybe not surprising that people still feel that women should not be able to lead, because people have a theory, which is often kind of a common theory, that if you have been a business leader, or you've had a business, you might be better at running government. This is a theory you encounter in the press here all the time. So people might very well believe that.
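The logic of the lottery comparison the professor describes can be sketched with a toy calculation: because the reserved villages are chosen at random, a simple difference in group means is an unbiased estimate of the effect of having a woman leader. All figures below (village count, spending levels, effect size) are invented for illustration, not taken from the study.

```python
import random
import statistics

random.seed(0)

# Hypothetical villages; none of these numbers come from the actual data.
n_villages = 3000
villages = [{"id": i} for i in range(n_villages)]

# The government's lottery: one third of villages must have a woman leader.
reserved = set(random.sample(range(n_villages), k=n_villages // 3))

# Made-up outcome (spending on public goods): same distribution in both
# groups, plus an assumed effect of +5 in reserved villages.
for v in villages:
    v["spending"] = random.gauss(100, 15) + (5 if v["id"] in reserved else 0)

# Random assignment means the two groups are identical to start with,
# so the raw difference in means recovers the effect (up to noise).
treated = [v["spending"] for v in villages if v["id"] in reserved]
control = [v["spending"] for v in villages if v["id"] not in reserved]
effect = statistics.mean(treated) - statistics.mean(control)
print(round(effect, 1))  # close to the assumed effect of 5
```

This is exactly why the lottery matters: without random assignment, reserved and unreserved villages could differ to begin with, and the difference in means would mix that selection with the effect of the policy.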
So the question that's being asked in this experiment-- so in the first round of this lottery, people observed what happened. And, as I said, the answer was that the women did no worse than the men.

However, there were still some hitches in the system. So in the next round, what was supposed to have happened was that there would be no lottery. People would just reverse. So let's say you had a woman leader before; now you wouldn't have one, and vice versa. That was what was supposed to happen. In fact, that didn't happen. What happened was that there was some political negotiation, and they had a second lottery. So there were places that were randomly chosen again. And because there was a second lottery, some places had women leaders twice, just because they happened to win the lottery twice. Some of them lost twice, depending on your point of view. Some of them had women leaders once. And some had them zero times.

So this data is after 10 years. So the data we're going to look at is after more than five years.
So these are places which have had a woman leader for several years. They've had her first for five years and then possibly for another couple of years. Or they've had men leaders. So this is data after a fairly long period of time. And, as I said, there are three different groups here: people who've had women leaders for two terms, or at least one-plus term; people who have had women leaders for one term; and people who have had women leaders for zero terms. So that's the comparison we're going to be interested in.

Now, the way the comparison was done is interesting. They took a real speech that was given by a leader, a man leader in this case. And then they had a woman speak out the same speech. So they had two recordings of the same speech, identical speech, one in a woman's voice and one in a man's voice. And then what they did was they took these two speeches to a random sample of villagers and asked them to rate the speech they heard for the leader's competence. They were hearing a man speak, or a woman speak.
They were hearing exactly the same speech. And you're asking them which one sounds more competent. So some people were asked about men; others were asked about women. And the idea was precisely to see whether the same speech gets graded differently, whether people are bringing something to the speech which is not in the speech. The speech is just a voice saying the same words. Are you bringing something different to it? Are you bringing some prejudice to it? You think that women are better, women are worse, whatever it is. This is an attempt to measure prejudice. Yeah.

AUDIENCE: I have a quick question. If a woman's a leader, do people in the community think she does whatever her husband tells her to do?

PROFESSOR: Yeah, for example. That's an interesting question. So when this was implemented, people basically said this is not going to change anything. Women leaders are going to be just like men leaders. Why? Because men will lead in any case. The women will just be decorative. They'll be on the ballot but not actually leading.
And indeed in many villages there is somebody who people talk about quite casually as the husband of the leader. They have a word for it in Hindi, which is "the husband of the leader." And that's kind of like First Lad or something.

So people thought this would have no effect. In fact, when you look at the data, it's clear that places that were randomly assigned a woman leader are much more likely to have investments in things women care about. So women mostly care about water, and places where women are leaders have more investment in water. Water's a big women's issue because women have to find drinking water. Let's say drinking water is a huge women's issue. And so when you look at the village meetings, women ask questions about water, drinking water. And then you look at what happens if a woman is the head of the village, and you see exactly that: women leaders generate more investment in water.

So going back to this experiment, villagers were basically asked to grade the speech. And the speech was, as I said, identical.
So if you thought the man was more competent, or the woman was more competent, you were bringing some data to it that was not in the speech itself. The speech was word for word identical.

So this is the comparison, mostly of never reserved versus twice reserved. OK, that's the simplest comparison: a village that had a woman leader twice against one that has never had a woman leader. The first set of columns, set of bars, I guess, are for men. The second set of bars are for women. So when it says reserved, male, that means a man in a village that has been reserved twice. What does the man say? And the difference is the rating of the woman's speech versus the man's speech. And the result is quite dramatic. Men, in general, if the village was never reserved, think the woman's speech is much worse. That's the blue bar. It's negative. Nonetheless, if you see what happens after it's reserved twice, the men change their mind. Now the woman's speech is better. For women, the original prejudice was smaller and the swing is smaller.
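The rating gap behind those bars can be illustrated with a toy calculation. The scores below are entirely invented; only the sign pattern (a negative gap for men in never-reserved villages, a positive one after two reserved terms) mirrors what the lecture describes.

```python
from statistics import mean

# Hypothetical 1-10 competence ratings of the identical speech.
# Tuples: (villager_gender, village_reserved_twice, speaker_voice, rating)
ratings = [
    ("male", False, "woman", 4), ("male", False, "man", 7),
    ("male", True,  "woman", 7), ("male", True,  "man", 6),
    ("female", False, "woman", 6), ("female", False, "man", 7),
    ("female", True,  "woman", 7), ("female", True,  "man", 7),
]

def gap(gender, reserved):
    """Average rating of the woman's voice minus the man's voice."""
    woman = mean(r for g, res, v, r in ratings
                 if g == gender and res == reserved and v == "woman")
    man = mean(r for g, res, v, r in ratings
               if g == gender and res == reserved and v == "man")
    return woman - man

# Since the speech is word for word identical, any nonzero gap measures
# prejudice: men in never-reserved villages grade the woman's voice down,
# and the gap flips sign after two reserved terms.
print(gap("male", False), gap("male", True))  # -3 1
```

The same function applied to the female raters would show the smaller initial gap and smaller swing mentioned in the lecture.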
That's maybe not surprising, because women start with less prejudice, and then they don't react as much either.

On the other hand, they also asked a different question. This was kind of a measure of competence, right? This was actually asking them: does the person sound competent? If you'd actually asked them the question "should a woman be a leader," they still say no. So it's not that they've changed their minds there. It's their objective judgment that has changed. They feel less negative about women.

The taste is still against women. But somehow, once you've been exposed to women leaders, you realize that they're just as good as men. You might feel that they're really not a good idea, that women should not be in this job. But when you look at who does the job, you figure that they're just as good as others.

There's a similar experiment that was done in-- so this is sort of continuing that same data. This is actually election outcomes. So this is comparing what fraction of women get elected. Now, in unreserved seats-- so these were seats in 2008 that had been reserved once, twice, or zero times.
That's what you're comparing. You're taking seats in 2008 that are not reserved. Now, some of these seats in '98 and 2003 were reserved. Some were reserved once. Some were reserved twice. This is the third election. So do you understand? So instead of asking what your preferences are, this is asking: what did you observe? How did you actually vote? Did you support women?

And, here, again you see the same pattern. These are two different jobs. So the greens are different jobs. The pradhan is the top leader. The ward councillor is the next level of leader. And if you look at both, you just see that they keep going up. So if you reserve more, you vote more for women. These are unreserved seats, so right now you're not required to vote for women. And if you start with the ones which were never reserved, you get 9%. For the ones which were reserved twice, you get twice as much. So basically people's political judgments are substantially shifted by being reserved twice. Yeah.
AUDIENCE: Is there any data on the quality of leadership between women who run in a non-reserved village versus in a reserved village?

PROFESSOR: There's certainly data on-- I don't know the answer to that question. My guess is that the women who won in the unreserved villages are better. And that's what you'd expect. And I remember somebody saying that they're definitely more educated. So if you have the confidence to run without any reservation, then you are probably better situated in any case. And that seems to be true. I don't remember. The same data that is used to say that women leaders are less corrupt than men leaders must also say something about reserved versus unreserved women. I just don't know the answer. Yeah.

AUDIENCE: Could this just be because more women are running in areas [INAUDIBLE]?

PROFESSOR: Oh, absolutely. But that's an outcome, right? Women are running. And so part of it is that they're more often incumbents.
But the fact is that, for whatever reason, once you expose people to the possibility that they can vote for women, they vote for women. So in other words, policies change politics. That's the point I wanted to make, which is something I've been pushing for a while through the examples we talked about.

And so, basically, to summarize: it's not that political constraints are not important. Lots of political events have huge consequences. But that shouldn't stop us. The main point we're making here is that that shouldn't stop us in the fight for better policies because, A, better policies are implementable even when the politics is bad. You often still get some good policies. And second, if you get a chance to have a good policy, you'd better have one ready. So imagine the politics shifted and somebody asked you: this is what I want to do; what would you do? You would do something very, very different if you knew, or if you had thought about it.

And, moreover, if you do undertake better policies, voters react.
So there's no fixed prejudice that makes voters completely unresponsive to performance. If you show performance, voters seem to react. And especially if there's strong evidence that the policy is working, people react even more.

So the one story that I think is on this slide and is worth mentioning is the story of a program called Progresa, which is a Mexican program that has now been imitated in many, many countries, including, briefly, by Mayor Bloomberg in New York City. This is a program called a conditional cash transfer: basically a program which told parents that if their children were immunized and sent to school, they'd get a little bit of money, or quite a bit of money, actually. So they were paying parents to make sure that the parents send their children to school.

So when this program was launched, the person who launched it realized that in Mexico every six years the president changes and all the policies change. And so the policy would die in six years even if it was a great policy. So what he did is he actually got a bunch of people from the University of Pennsylvania and the University of California involved in evaluating the program.
And it was introduced in a random way, so you could compare the places where the policy was implemented to the places where it was not. And there was a bunch of well known scholars from all over the world who were evaluating it. Their results came out very positive.

Then, as the guy who had implemented it expected, the party that was in power lost the elections. So a new president came in from a different party. And he immediately said: we have to shut down Progresa, because Progresa is the previous government's program; we don't like it. So it was shut down. But he immediately replaced it with a slightly different program, essentially identical to it, called Oportunidades. So basically the fact that there was all this evidence out there made it very difficult for them to actually shut it down. And so the program actually survived and has now spread to many countries.

So the general point to be made here is that I think it's too easy to assume that, well, the political system is very powerful, we just cannot change anything, and, therefore, we shouldn't intervene.
But, in fact, when you actually are strategic about playing that game, you do get lots of leverage. This Progresa example is an excellent example of how you need to be strategic about the use of information. But if you are strategic, then you might be capable of generating lots of change. So that's one general point I wanted to make.

But I have a set of slides which are an attempt to summarize all the general points we wanted to make. So let me just go through them.

So this is the heading. The end of our book is called "In Place of a Sweeping Conclusion." And you can imagine why we don't want a sweeping conclusion, since our whole point is that there is not one conclusion. There's not one solution that will solve all the problems in the world. There's no sense in claiming that there is one answer. However, it's useful to illustrate some general principles. And so what I'm going to do is go through the five key principles that we think are useful to keep in mind whenever you're thinking about these issues. So this is the first one.
So do you remember any examples of this? Yeah.

AUDIENCE: I think it was, they believe, for health and disease stuff, they believe that the suggestions the religious leaders were giving were better than the medicines that--

PROFESSOR: For example. Others? There were many. Sorry?

AUDIENCE: [INAUDIBLE].

PROFESSOR: What about examples of situations where the poor seem to act as if they don't have key pieces of information, or they have beliefs which are maybe misplaced? We went over many during the-- yeah.

AUDIENCE: Education has an S-shaped curve to it.

PROFESSOR: Right. That education has very high returns at the top and almost no returns at the bottom. Yeah.

AUDIENCE: Probably make more sense to invest in education for women [INAUDIBLE].

PROFESSOR: Yeah. We talked about, for example, the belief that you should get a shot rather than a tablet whenever you get sick. Typically a tablet is much safer than a shot and does exactly the same thing. What's another example?
432 00:27:05,480 --> 00:27:08,540 The other examples that we discussed a lot are examples 433 00:27:08,540 --> 00:27:14,060 not so much of the wrong belief as much as the lack of 434 00:27:14,060 --> 00:27:17,820 belief, that people often don't seem to believe that 435 00:27:17,820 --> 00:27:24,730 different technologies are as effective as we think. 436 00:27:24,730 --> 00:27:27,590 So one of the reasons why people don't want bed nets is 437 00:27:27,590 --> 00:27:30,380 they don't seem to believe that bed nets will work. 438 00:27:30,380 --> 00:27:35,360 And when you give people bed nets and they use them, a year 439 00:27:35,360 --> 00:27:38,050 later they're willing to buy it at full price. 440 00:27:38,050 --> 00:27:40,520 A lot more of them are willing to buy it at full price 441 00:27:40,520 --> 00:27:45,300 because they see that they're useful and usable. 442 00:27:45,300 --> 00:27:48,050 The fact that you go and tell them this is great for you is 443 00:27:48,050 --> 00:27:48,850 not enough. 444 00:27:48,850 --> 00:27:49,710 They don't believe you. 445 00:27:49,710 --> 00:27:53,300 So when they use it, they start believing you. 446 00:27:53,300 --> 00:28:02,340 Or another example of rather weak beliefs is immunization. 447 00:28:02,340 --> 00:28:05,590 People sort of think that immunization, maybe it works, 448 00:28:05,590 --> 00:28:06,770 maybe it doesn't. 449 00:28:06,770 --> 00:28:09,840 It's not that they don't go to get immunized, but they don't 450 00:28:09,840 --> 00:28:12,550 really make sure that their children get all the 451 00:28:12,550 --> 00:28:13,240 immunizations. 452 00:28:13,240 --> 00:28:13,704 Yeah. 453 00:28:13,704 --> 00:28:17,416 AUDIENCE: Well, fertilizer is kind of a counter example in 454 00:28:17,416 --> 00:28:20,710 that they know it works, but then they try [INAUDIBLE]. 455 00:28:20,710 --> 00:28:21,350 PROFESSOR: Absolutely. 456 00:28:21,350 --> 00:28:25,620 So the converse was not being claimed. 
457 00:28:25,620 --> 00:28:27,520 It's not that the poor only believe 458 00:28:27,520 --> 00:28:28,770 things which are false. 459 00:28:32,370 --> 00:28:36,210 AUDIENCE: That was just a lack of information [INAUDIBLE]. 460 00:28:36,210 --> 00:28:38,360 PROFESSOR: Yeah. 461 00:28:38,360 --> 00:28:40,800 You're right. 462 00:28:40,800 --> 00:28:42,410 It's a good example. 463 00:28:42,410 --> 00:28:45,710 But it is something where that's certainly not what we 464 00:28:45,710 --> 00:28:46,960 were claiming. 465 00:28:49,940 --> 00:28:57,800 The second one is we say that the poor bear responsibility 466 00:28:57,800 --> 00:29:00,270 for far too much of their lives. 467 00:29:00,270 --> 00:29:06,610 And what do we mean here? 468 00:29:06,610 --> 00:29:07,860 What are examples? 469 00:29:10,376 --> 00:29:12,340 AUDIENCE: There are examples in health. 470 00:29:12,340 --> 00:29:16,268 For instance, they have to decide every time about 471 00:29:16,268 --> 00:29:18,723 purification of water. 472 00:29:18,723 --> 00:29:22,160 And [INAUDIBLE] 473 00:29:22,160 --> 00:29:23,410 many examples. 474 00:29:27,070 --> 00:29:28,740 AUDIENCE: I was going to mention those, too. 475 00:29:28,740 --> 00:29:30,720 But also maybe education. 476 00:29:30,720 --> 00:29:32,712 You have a system of [INAUDIBLE] 477 00:29:32,712 --> 00:29:36,664 school systems working well in poor villages [INAUDIBLE] 478 00:29:36,664 --> 00:29:40,780 no idea that there's children not learning anything actually 479 00:29:40,780 --> 00:29:43,086 at school [INAUDIBLE]. 480 00:29:43,086 --> 00:29:43,580 PROFESSOR: Exactly. 481 00:29:43,580 --> 00:29:44,960 That's a nice example. 482 00:29:44,960 --> 00:29:47,470 Water's a very good example. 483 00:29:47,470 --> 00:29:51,490 Typically I think lots of people have said why don't 484 00:29:51,490 --> 00:29:56,480 poor people use, like, water purification? 485 00:29:56,480 --> 00:29:57,980 And it's true that they should use it. 
486 00:29:57,980 --> 00:29:59,140 But think of what it means. 487 00:29:59,140 --> 00:30:02,680 It means every time you use water, you have to make sure 488 00:30:02,680 --> 00:30:03,920 it's purified. 489 00:30:03,920 --> 00:30:07,020 Whereas for us, we just don't even worry about it. 490 00:30:07,020 --> 00:30:08,090 We run the tap. 491 00:30:08,090 --> 00:30:11,920 So in some sense we have it so easy. 492 00:30:11,920 --> 00:30:14,910 So think about pensions. 493 00:30:14,910 --> 00:30:20,360 If you work at MIT, somebody deducts pension money from 494 00:30:20,360 --> 00:30:23,600 your salary, puts it into an account, and somebody else 495 00:30:23,600 --> 00:30:25,130 invests it for you. 496 00:30:25,130 --> 00:30:26,500 And you have no choice over it. 497 00:30:26,500 --> 00:30:30,360 But on the other hand, you know that something's being 498 00:30:30,360 --> 00:30:31,320 done about it. 499 00:30:31,320 --> 00:30:34,260 If you're poor, you have to invest the money all by 500 00:30:34,260 --> 00:30:37,430 yourself and make sure that, you know, it's getting the 501 00:30:37,430 --> 00:30:40,670 right returns or finding a place that's safe. 502 00:30:40,670 --> 00:30:43,250 We don't solve any of those problems. 503 00:30:43,250 --> 00:30:46,060 Education's a great one actually. 504 00:30:46,060 --> 00:30:50,610 I think we can take as given that the 505 00:30:50,610 --> 00:30:52,400 school systems are rated. 506 00:30:52,400 --> 00:30:55,520 We have the ability to evaluate what our 507 00:30:55,520 --> 00:30:56,720 children are learning. 508 00:30:56,720 --> 00:30:59,050 If you're poor then you really don't know. 509 00:30:59,050 --> 00:31:01,170 If you want to actually make sure your child is learning, 510 00:31:01,170 --> 00:31:04,270 you have to go and find somebody else who does know 511 00:31:04,270 --> 00:31:13,090 how to read and write and get him to intervene and make sure 512 00:31:13,090 --> 00:31:15,580 that the child is learning. 
513 00:31:15,580 --> 00:31:16,830 Do you want-- 514 00:31:18,780 --> 00:31:21,190 yeah? 515 00:31:21,190 --> 00:31:22,944 AUDIENCE: I don't know if this falls under what [INAUDIBLE]. 516 00:31:22,944 --> 00:31:23,932 It probably does. 517 00:31:23,932 --> 00:31:29,366 But insurance because I think bearing responsibility for 518 00:31:29,366 --> 00:31:33,812 major things that are external to their future [INAUDIBLE], 519 00:31:33,812 --> 00:31:37,770 whereas if you bring [INAUDIBLE]. 520 00:31:37,770 --> 00:31:38,110 PROFESSOR: Right. 521 00:31:38,110 --> 00:31:41,590 So that's absolutely right. 522 00:31:41,590 --> 00:31:48,170 In the US, if there's a big storm then FEMA steps in. 523 00:31:48,170 --> 00:31:51,330 And sometimes it doesn't work so well like in New Orleans. 524 00:31:51,330 --> 00:31:56,130 But on average there's a huge number of 525 00:31:56,130 --> 00:32:01,050 federal agencies that are there to guarantee you some 526 00:32:01,050 --> 00:32:04,760 minimum standard of living and to help you rebuild your houses and all 527 00:32:04,760 --> 00:32:06,480 that disaster relief. 528 00:32:06,480 --> 00:32:13,480 If the same storm happens in some less well run country 529 00:32:13,480 --> 00:32:15,870 then in some sense you know that is 530 00:32:15,870 --> 00:32:16,840 going to be your problem. 531 00:32:16,840 --> 00:32:21,040 So you spend all your time worrying about what you will 532 00:32:21,040 --> 00:32:22,200 have to do about it. 533 00:32:22,200 --> 00:32:26,400 In the US, you basically know that if it's something that's 534 00:32:26,400 --> 00:32:31,480 really out of control then it will be taken care of mostly. 535 00:32:31,480 --> 00:32:35,730 And that's even more true in other rich countries. 536 00:32:35,730 --> 00:32:41,036 Among rich countries, the US is particularly laissez faire. 537 00:32:41,036 --> 00:32:42,964 AUDIENCE: [INAUDIBLE] you're saying [INAUDIBLE] 538 00:32:42,964 --> 00:32:44,892 poor people than poor countries. 
539 00:32:44,892 --> 00:32:47,662 Because I think that a lot of this stuff, this also does 540 00:32:47,662 --> 00:32:50,200 apply [INAUDIBLE]. 541 00:32:50,200 --> 00:32:50,420 PROFESSOR: Absolutely. 542 00:32:50,420 --> 00:32:53,950 So I think poor people in rich countries have many of the 543 00:32:53,950 --> 00:32:56,810 same issues. 544 00:32:56,810 --> 00:33:00,720 For example, if you don't have a regular job, you don't get 545 00:33:00,720 --> 00:33:01,580 health insurance. 546 00:33:01,580 --> 00:33:02,830 You don't get pensions. 547 00:33:05,480 --> 00:33:08,240 So you have to then decide to do your savings on your own. 548 00:33:08,240 --> 00:33:15,522 You have to find a way to fund your health problems on your 549 00:33:15,522 --> 00:33:17,380 own, et cetera. 550 00:33:17,380 --> 00:33:20,310 You bear much more response-- for an MIT professor, so many 551 00:33:20,310 --> 00:33:23,010 things are taken care of, which if you were poor, they 552 00:33:23,010 --> 00:33:23,600 wouldn't be. 553 00:33:23,600 --> 00:33:24,850 So that's absolutely right. 554 00:33:27,330 --> 00:33:32,740 This is also true, as Melissa said, of poor people all over 555 00:33:32,740 --> 00:33:34,380 the world, including the US. 556 00:33:34,380 --> 00:33:37,830 Markets will work less well for poor people. 557 00:33:42,200 --> 00:33:45,430 And the corollary of that is therefore you should not 558 00:33:45,430 --> 00:33:49,130 always pin your fate on markets delivering. 559 00:33:49,130 --> 00:33:53,380 So what's an example of markets working less well for 560 00:33:53,380 --> 00:33:58,360 people and why would they work less well? 561 00:33:58,360 --> 00:33:59,285 Yeah. 562 00:33:59,285 --> 00:34:01,265 AUDIENCE: The insurance market, performing [INAUDIBLE] 563 00:34:01,265 --> 00:34:03,740 just because the information [INAUDIBLE] 564 00:34:03,740 --> 00:34:08,195 to form institutions in some of, I guess, the [INAUDIBLE]. 
565 00:34:08,195 --> 00:34:10,670 So that's why they form more informal insurance markets 566 00:34:10,670 --> 00:34:14,465 where family members and community members try to 567 00:34:14,465 --> 00:34:16,610 insure against a bad shock. 568 00:34:16,610 --> 00:34:17,179 PROFESSOR: Right. 569 00:34:17,179 --> 00:34:19,429 That's a great example. 570 00:34:19,429 --> 00:34:23,830 You don't really have access to formal insurance because 571 00:34:23,830 --> 00:34:27,030 maybe you are just not valuable enough and you're too 572 00:34:27,030 --> 00:34:28,619 far away from the insurance company. 573 00:34:40,673 --> 00:34:41,655 Yeah. 574 00:34:41,655 --> 00:34:44,601 AUDIENCE: I guess the mosquito net market because they don't 575 00:34:44,601 --> 00:34:46,319 see the importance or the value of 576 00:34:46,319 --> 00:34:48,038 buying a mosquito net. 577 00:34:48,038 --> 00:34:51,475 So you have to give it away first to convince them of its 578 00:34:51,475 --> 00:34:51,966 importance. 579 00:34:51,966 --> 00:34:56,400 But you also have to make sure to give it away [INAUDIBLE]. 580 00:34:56,400 --> 00:34:57,950 PROFESSOR: They don't mostly. 581 00:34:57,950 --> 00:34:59,500 I think we talked about the evidence [INAUDIBLE]. 582 00:35:06,346 --> 00:35:07,324 Yep. 583 00:35:07,324 --> 00:35:09,606 AUDIENCE: There's talk about the fertilizer market and how 584 00:35:09,606 --> 00:35:11,236 people might have had money at the right time when 585 00:35:11,236 --> 00:35:13,920 [INAUDIBLE]. 586 00:35:13,920 --> 00:35:18,880 PROFESSOR: If you have no place to sort of-- 587 00:35:18,880 --> 00:35:21,875 one thing that makes life very difficult for people in 588 00:35:21,875 --> 00:35:24,500 agriculture is that they get their entire 589 00:35:24,500 --> 00:35:25,822 salary all at once. 
590 00:35:25,822 --> 00:35:28,840 And then they have to figure out a way to spend it smoothly 591 00:35:28,840 --> 00:35:32,010 over a period of time because they're going to get no money 592 00:35:32,010 --> 00:35:33,550 for a long time. 593 00:35:33,550 --> 00:35:36,000 We get paid every week or every month. 594 00:35:36,000 --> 00:35:41,110 So even if we spent last month's income, we have this 595 00:35:41,110 --> 00:35:42,480 month's income coming in. 596 00:35:42,480 --> 00:35:45,210 If you have an income every six months, you have to have 597 00:35:45,210 --> 00:35:47,140 that much more self-restraint. 598 00:35:47,140 --> 00:35:50,620 And so poor people are often farmers, and they have to deal 599 00:35:50,620 --> 00:35:54,340 with these very large ups and downs in incomes which we 600 00:35:54,340 --> 00:35:55,600 don't have to deal with at all. 601 00:36:09,781 --> 00:36:10,759 Yeah. 602 00:36:10,759 --> 00:36:12,226 AUDIENCE: Maybe of giving things away. 603 00:36:12,226 --> 00:36:14,426 You have to give away a bag of rice to get 604 00:36:14,426 --> 00:36:15,160 people to get vaccines. 605 00:36:15,160 --> 00:36:18,440 So it's more about the [INAUDIBLE]. 606 00:36:18,440 --> 00:36:19,670 PROFESSOR: Yeah. 607 00:36:19,670 --> 00:36:20,660 Exactly. 608 00:36:20,660 --> 00:36:22,780 Once the market doesn't work then you have to 609 00:36:22,780 --> 00:36:25,350 think of what to do. 610 00:36:25,350 --> 00:36:28,380 We talked about the credit market, where 611 00:36:28,380 --> 00:36:36,930 the reason why the poor end up paying much higher 612 00:36:36,930 --> 00:36:43,230 interest rates is because there's this vicious circle of 613 00:36:43,230 --> 00:36:46,300 if you are poor then nobody's going to lend 614 00:36:46,300 --> 00:36:47,890 you a lot of money. 
615 00:36:47,890 --> 00:36:50,700 So since you're going to be lent a very small amount of 616 00:36:50,700 --> 00:36:56,770 money then the interest rate will have to cover the cost of 617 00:36:56,770 --> 00:36:58,160 enforcing the loan. 618 00:37:01,080 --> 00:37:03,200 And that's kind of a fixed cost. 619 00:37:03,200 --> 00:37:07,295 So if that's, let's say, $10 and you are borrowing $10 620 00:37:07,295 --> 00:37:09,710 million, then that's nothing. 621 00:37:09,710 --> 00:37:15,370 If you were borrowing $10, then that's, you know, 100%. 622 00:37:15,370 --> 00:37:20,810 So you get a much bigger kick from having 623 00:37:20,810 --> 00:37:22,250 high interest rates. 624 00:37:22,250 --> 00:37:26,140 And then it turns into this kind of further vicious cycle 625 00:37:26,140 --> 00:37:31,800 because if the interest rate is high then the incentive to 626 00:37:31,800 --> 00:37:37,330 default is higher and then you have to spend even more money 627 00:37:37,330 --> 00:37:39,110 making sure that people don't default. 628 00:37:39,110 --> 00:37:42,250 But that then raises the interest rate even more. 629 00:37:42,250 --> 00:37:46,360 So you get this vicious cycle of you start by being poor and 630 00:37:46,360 --> 00:37:49,510 you end up with a very high interest rate. 631 00:37:49,510 --> 00:37:53,515 That's one of the reasons why the poor find it very 632 00:37:53,515 --> 00:37:54,765 difficult to borrow. 633 00:38:08,700 --> 00:38:11,065 So this one we talked about already. 634 00:38:11,065 --> 00:38:15,750 This is the one we were talking about a lot. 635 00:38:15,750 --> 00:38:20,160 So I'll skip it and go to the last one, which is that a lot 636 00:38:20,160 --> 00:38:25,340 of what keeps people in poverty is just the 637 00:38:25,340 --> 00:38:30,790 expectations of hopelessness and 638 00:38:30,790 --> 00:38:34,880 expectations of negative outcomes. 639 00:38:34,880 --> 00:38:37,332 Do you want to give examples of that? 640 00:38:37,332 --> 00:38:37,823 Yeah. 
641 00:38:37,823 --> 00:38:38,805 AUDIENCE: [INAUDIBLE] 642 00:38:38,805 --> 00:38:42,487 give them a test and then told them to stay for the test or 643 00:38:42,487 --> 00:38:44,640 not, like, making them aware [INAUDIBLE]. 644 00:38:44,640 --> 00:38:45,590 PROFESSOR: Louder. 645 00:38:45,590 --> 00:38:47,090 AUDIENCE: Basically, in education, students 646 00:38:47,090 --> 00:38:49,440 [INAUDIBLE]. 647 00:38:49,440 --> 00:38:49,840 PROFESSOR: Right. 648 00:38:49,840 --> 00:38:56,600 So students are often told that because they didn't do 649 00:38:56,600 --> 00:38:59,310 well in their first-grade test, they're stupid. 650 00:38:59,310 --> 00:39:02,050 And if they believe they're stupid, they stop trying. 651 00:39:02,050 --> 00:39:03,300 What's another example? 652 00:39:12,713 --> 00:39:17,186 AUDIENCE: That, like, we saw that if the poor believe that 653 00:39:17,186 --> 00:39:19,671 they are not going to earn much from a particular 654 00:39:19,671 --> 00:39:23,150 profession then they will not pursue that [INAUDIBLE] 655 00:39:23,150 --> 00:39:24,170 business. 656 00:39:24,170 --> 00:39:24,550 PROFESSOR: Right. 657 00:39:24,550 --> 00:39:35,500 So if you think that the rewards that you're going to 658 00:39:35,500 --> 00:39:38,520 get from some investment are pretty small, so your life 659 00:39:38,520 --> 00:39:41,540 won't really change in any substantial way, then you might 660 00:39:41,540 --> 00:39:44,550 just give up and not try very much to do anything 661 00:39:44,550 --> 00:39:46,540 innovative. 662 00:39:46,540 --> 00:39:47,355 Yeah. 663 00:39:47,355 --> 00:39:50,564 AUDIENCE: There's also the savings example of always 664 00:39:50,564 --> 00:39:52,037 expecting [INAUDIBLE] 665 00:39:52,037 --> 00:39:57,438 about that can be suspended as who's qualified [INAUDIBLE]. 666 00:39:57,438 --> 00:39:59,893 Knowing that some things can come up that are going to 667 00:39:59,893 --> 00:40:03,350 require money gives an incentive to try to save. 
668 00:40:03,350 --> 00:40:04,790 PROFESSOR: Right. 669 00:40:04,790 --> 00:40:07,490 So you might expect that somebody will tax it away 670 00:40:07,490 --> 00:40:10,080 somehow and then you'll never want to save. 671 00:40:13,020 --> 00:40:14,270 Any other examples? 672 00:40:27,810 --> 00:40:31,560 So I just wanted to wrap up in a couple of minutes. 673 00:40:31,560 --> 00:40:33,080 Let me say a couple of things. 675 00:40:35,060 --> 00:40:40,960 One thing is that you should really give us the evaluation 676 00:40:40,960 --> 00:40:41,760 for this class. 677 00:40:41,760 --> 00:40:45,090 It's the first time we taught it in this format, and I have 678 00:40:45,090 --> 00:40:50,460 no idea what the specific problems are and 679 00:40:50,460 --> 00:40:51,660 what we could do better. 680 00:40:51,660 --> 00:40:53,970 So it will be helpful for us. 681 00:40:53,970 --> 00:41:01,970 And it's only fair to the two TAs who work very hard on this 682 00:41:01,970 --> 00:41:04,310 to get feedback on what they've done because that 683 00:41:04,310 --> 00:41:07,410 actually matters, for example, when they go on the 684 00:41:07,410 --> 00:41:08,890 job market. 685 00:41:08,890 --> 00:41:11,040 This is something that matters. 686 00:41:11,040 --> 00:41:12,630 I know it's easy to forget. 687 00:41:12,630 --> 00:41:15,460 But you should remember to do it. 688 00:41:18,220 --> 00:41:23,470 Other than that, I guess, just to wrap up, this was an 689 00:41:23,470 --> 00:41:32,640 attempt to kind of expose you to how people-- 690 00:41:32,640 --> 00:41:37,320 I think mostly people from sort of the community of 691 00:41:37,320 --> 00:41:41,440 people associated with the Poverty Action Lab but also, I 692 00:41:41,440 --> 00:41:44,730 think, development economics more generally think about 693 00:41:44,730 --> 00:41:46,830 issues around poverty. 
694 00:41:46,830 --> 00:41:49,140 And the thing that I think you should take away 695 00:41:49,140 --> 00:41:55,660 from this is not that we have solved the problem of poverty. 696 00:41:55,660 --> 00:41:59,190 I think that would be an overstatement by some margin. 697 00:41:59,190 --> 00:42:05,850 But at least, at this point, that there's a huge gap 698 00:42:05,850 --> 00:42:10,800 between the level at which the journalistic accounts of 699 00:42:10,800 --> 00:42:14,230 poverty operate and the 700 00:42:14,230 --> 00:42:19,990 level at which all of us, after this conversation, should 701 00:42:19,990 --> 00:42:20,560 approach it. 702 00:42:20,560 --> 00:42:25,530 In some ways there is a very large and relatively stable 703 00:42:25,530 --> 00:42:29,110 body of knowledge about what the big issues are and what 704 00:42:29,110 --> 00:42:30,330 the sticking points are. 705 00:42:30,330 --> 00:42:32,440 There's a sense in which we are-- 706 00:42:32,440 --> 00:42:35,100 I don't know that we have all the answers or 707 00:42:35,100 --> 00:42:36,480 even all the problems. 708 00:42:36,480 --> 00:42:39,610 But at least some problems have been clearly identified. 709 00:42:39,610 --> 00:42:41,960 They keep coming back, the same problems. 710 00:42:41,960 --> 00:42:45,800 And we kind of know how to think about them and what we can 711 00:42:45,800 --> 00:42:46,320 do about it. 712 00:42:46,320 --> 00:42:50,290 In that sense I think there's 713 00:42:50,290 --> 00:42:52,390 reason to be optimistic. 
714 00:42:52,390 --> 00:42:55,690 Something I keep saying is that if you 715 00:42:55,690 --> 00:43:00,840 think about sort of social policy and policy against 716 00:43:00,840 --> 00:43:04,300 poverty, the first thing to remember is that this is an 717 00:43:04,300 --> 00:43:09,300 extraordinarily modern invention, that basically 718 00:43:09,300 --> 00:43:14,240 other than supporting the extremely destitute at some 719 00:43:14,240 --> 00:43:20,320 very low level there was no government policy to help the 720 00:43:20,320 --> 00:43:25,480 poor at all really ever. 721 00:43:25,480 --> 00:43:29,240 If you were extremely poor, somebody would feed you. 722 00:43:29,240 --> 00:43:31,010 But mostly there was nothing else. 723 00:43:31,010 --> 00:43:33,960 I mean the education systems really didn't exist except for 724 00:43:33,960 --> 00:43:34,890 the elites. 725 00:43:34,890 --> 00:43:38,750 Health systems didn't exist except in the private market. 726 00:43:38,750 --> 00:43:42,240 And other forms of insurance didn't exist. 727 00:43:42,240 --> 00:43:47,290 If you look at the world in 1900, the most striking fact 728 00:43:47,290 --> 00:43:50,340 about it is how small governments are. 729 00:43:50,340 --> 00:43:54,920 No government is spending more than 10% of GDP. 730 00:43:54,920 --> 00:43:59,500 Tax collections are at most 10% of GDP. 731 00:43:59,500 --> 00:44:02,920 Now, if you take, like, Sweden, tax collections are 732 00:44:02,920 --> 00:44:04,730 55% of GDP-- 733 00:44:04,730 --> 00:44:07,230 so just to think of the range. 734 00:44:07,230 --> 00:44:12,100 So what has happened over the last 100 years, and actually 735 00:44:12,100 --> 00:44:16,510 not even 100 years, really after the First World War, was 736 00:44:16,510 --> 00:44:24,040 a massive expansion of things that governments do. 
737 00:44:24,040 --> 00:44:27,530 And more generally a massive expansion of systematic 738 00:44:27,530 --> 00:44:33,780 interventions to help people live their lives better. 739 00:44:33,780 --> 00:44:37,720 Education systems massively expanded. 740 00:44:37,720 --> 00:44:39,210 Health care massively expanded. 741 00:44:42,270 --> 00:44:44,430 Social support programs massively expanded. 742 00:44:44,430 --> 00:44:50,930 In general the world of sort of interventions is a very 743 00:44:50,930 --> 00:44:54,480 modern world. 744 00:44:54,480 --> 00:44:57,870 This whole idea that there should be social policy to 745 00:44:57,870 --> 00:45:01,160 help the poor and all of us should worry about it is a 746 00:45:01,160 --> 00:45:03,050 very, very recent idea. 747 00:45:03,050 --> 00:45:05,280 So the fact that we haven't got it right 748 00:45:05,280 --> 00:45:07,200 already is no surprise. 749 00:45:07,200 --> 00:45:18,800 In some sense, what happened was even between 1920 or 1910 750 00:45:18,800 --> 00:45:22,740 and 1950, there's a massive expansion of governments in 751 00:45:22,740 --> 00:45:24,020 rich countries. 752 00:45:24,020 --> 00:45:26,840 Then a bunch of poor countries became independent and they 753 00:45:26,840 --> 00:45:30,840 imitated the rich countries by also substantially expanding 754 00:45:30,840 --> 00:45:33,680 their government. 755 00:45:33,680 --> 00:45:37,300 And then they often didn't have the experience, the state 756 00:45:37,300 --> 00:45:39,470 capacity, the bureaucratic capacity to 757 00:45:39,470 --> 00:45:41,020 implement those programs. 758 00:45:41,020 --> 00:45:43,960 So you've got a lot of things happening which shouldn't have 759 00:45:43,960 --> 00:45:45,730 happened, lots of disasters. 760 00:45:45,730 --> 00:45:51,550 But in my view that says very little. 
761 00:45:51,550 --> 00:45:56,130 If you see this in the sweep of history, in the last 10,000 762 00:45:56,130 --> 00:46:01,830 years of history, this is a period of 30 years in 763 00:46:01,830 --> 00:46:04,710 developing countries, between 1960 and 1990. 764 00:46:04,710 --> 00:46:06,240 Governments expanded a lot. 765 00:46:06,240 --> 00:46:08,140 They tried a bunch of new programs. 766 00:46:08,140 --> 00:46:09,030 Many of them failed. 767 00:46:09,030 --> 00:46:10,350 There was lots of corruption. 768 00:46:10,350 --> 00:46:13,420 But I think if you see the sweep of history, we are just 769 00:46:13,420 --> 00:46:17,970 at the beginning of a process of trying to create a more 770 00:46:17,970 --> 00:46:19,510 humane social policy. 771 00:46:19,510 --> 00:46:24,410 We're nowhere near the point where we would like to be. 772 00:46:24,410 --> 00:46:25,880 And this is no accident. 773 00:46:25,880 --> 00:46:27,120 We just started. 774 00:46:27,120 --> 00:46:29,350 And in that sense, if you take the perspective that 775 00:46:29,350 --> 00:46:32,660 in any historical terms this is really the beginning, that 776 00:46:32,660 --> 00:46:36,890 we are at a very early point, then I 777 00:46:36,890 --> 00:46:41,950 think the main lesson that we suggest here should seem more 778 00:46:41,950 --> 00:46:45,550 plausible, which is that we really don't know very much. 779 00:46:45,550 --> 00:46:46,930 We've been shooting in the dark. 780 00:46:46,930 --> 00:46:49,490 We expanded too fast. 781 00:46:49,490 --> 00:46:51,140 We didn't know what we were doing. 782 00:46:51,140 --> 00:46:55,930 And we're still very early in the process. 783 00:46:55,930 --> 00:46:59,790 And what we need to do is step back and figure out what works 784 00:46:59,790 --> 00:47:00,990 and how to make it work. 785 00:47:00,990 --> 00:47:03,290 And if you had to take one lesson away from 786 00:47:03,290 --> 00:47:04,740 this class, it's that. 
787 00:47:04,740 --> 00:47:09,830 But there is a way to learn much better about what works 788 00:47:09,830 --> 00:47:10,570 and what doesn't. 789 00:47:10,570 --> 00:47:11,810 We haven't used it. 790 00:47:11,810 --> 00:47:14,210 And that's no accident because we are just 791 00:47:14,210 --> 00:47:16,340 getting started really. 792 00:47:16,340 --> 00:47:19,320 I think once you remember that lesson of history, I think 793 00:47:19,320 --> 00:47:23,150 things look less depressing and more promising than they 794 00:47:23,150 --> 00:47:24,810 would look otherwise. 795 00:47:24,810 --> 00:47:26,660 I'm going to stop there. 796 00:47:26,660 --> 00:47:33,460 Thank you and good luck for the final exam and for all the 797 00:47:33,460 --> 00:47:36,770 things you're going to be doing after this. 798 00:47:36,770 --> 00:47:45,535 [APPLAUSE]