The following content is provided by MIT OpenCourseWare under a Creative Commons license. Additional information about our license and MIT OpenCourseWare is available at ocw.MIT.edu.

PROFESSOR: Good afternoon.

For the past couple of lectures, I've been using love and romance as a way to talk about broad issues, like evolutionary psychology, that could be talked about with a wide range of other examples. Love and romance just happen to provide a particularly good set of examples for that particular topic. I'm going to do the same thing now with attitude formation and the links between attitudes and behavior. But I'm going to switch from the love-and-relationships topic to the topic of racist or prejudicial behavior and attitudes. Again, not because that's the only set of attitudes that is interesting or important, but because it makes for an interesting path through this material. So when you read the book, you'll get these topics discussed in rather more general terms; here, I'll discuss them in the specific terms of this particular problem.

I should note at the outset what it says about halfway down the first page of the handout, which is that I'm going to do my best to explain why prejudiced attitudes are easy to come by and are readily comprehensible in terms of psychological processes that we actually know something about. But explaining these things is not the same thing as excusing them. You can have a society that says: look, all else being equal, racial prejudice, and particularly behaviors based on prejudices (based on gender, race, national origin, religion), those sorts of biases are bad, and when they lead to behavior, we want to change that behavior. That's quite separate (not unrelated to, but separate from) the question of how you would explain it psychologically. It's important to remember that explanation is not the same thing as excuse.
Because I don't want people going out of the lecture saying, my psych professor explained that it's really, really easy to develop prejudicial attitudes, and that's OK. It's the "that's OK" part that doesn't follow.

I also put on the handout a pair of quotes from the early days of the civil rights movement: one from Eisenhower saying that you can't legislate morality, and a response from Martin Luther King saying, well, maybe not, but you can legislate moral behavior. That's really the social policy point. If you decide you don't like something that biology, psychology, or whatever is pushing people towards, you simply have to do something to make it harder for them to go where that tendency might push them.

Well, in any case, what I'm going to do is work through a story about the development of prejudicial attitudes that's got the four factors listed at the top of the handout. I'll work through each of them, and I hope you can see how they're tied together to make prejudice a very available option to us, and why this sort of thing happens to us with some regularity.

The first factor that I list there is ethnocentrism. That's the tendency to think that your group is the best group. The "we're number one" kind of thing. If you want to come up with, say, an evolutionary psych argument for why this might happen, it's kind of trivial. If you've got some notion that you want to get your genes into the next generation, then you might as well favor the people who are more closely related to you. So you should be more favorable to humans than to mice. You should be more favorable to people within your group, and even more so to people within your family, out of a fairly straightforward application of evolutionary theory, for example.

The more interesting aspect psychologically is how easy it is to get ethnocentric effects, to get the "we're number one" effect. Interesting evidence for this comes from what are called minimal group affiliation experiments. There are a lot of them. Let me describe a couple to you.
So here's an experiment. You come into the lab, and we're going to do an assessment of your taste in abstract art. We're going to show you a bunch of abstract pictures, and you're going to say how much you like each one on a scale of one to seven or something. Then the feedback you're going to get at the end is that you like the work of Paul Klee, one abstract artist, better than you like the work of Wassily Kandinsky, another abstract artist. Or it might be the other way around. I don't even remember, by the way, whether they were actually using real Klee and Kandinsky pictures. But you're going to get told that you're in the Klee group or the Kandinsky group. This is not a group assignment where there is a lot at stake.

Let us suppose that you've been assigned to the Klee group. In the second part of the experiment, you're playing some sort of a game, I don't quite remember what the story was, and it ends up with you being able to operate under one of two payoff rules. In one rule, you're going to give the other group one buck, and you're going to get two bucks. That's one possibility. The other possibility is a rule that gets you three bucks and gives them four bucks. So you've got a choice between option A (you get $2, they get $1) and option B (you get $3, they get $4).

Clearly, the rational choice from your vantage point is option B, because you get three bucks and not two bucks. But, in fact, there's a bias towards picking option A. Why is that? Well, if you take option A and get two bucks, you're getting more than they are. With option B, I'm going to get more than I would have with option A, but those guys are going to get a whole bunch more. Why should I give those Kandinsky lovers a bunch of stuff? Maybe you figure that these Kandinsky sorts really are a different, scummy kind of person, whom you really ought to stick it to.
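To make the logic of that choice concrete, here is a minimal Python sketch; it is an editorial illustration rather than anything from the original study. It contrasts a rule that maximizes your own absolute payoff with a rule that maximizes the gap between your group and the other group, using the dollar amounts from the lecture.

```python
# Minimal sketch of the two payoff rules: option A gives you $2 and the other
# group $1; option B gives you $3 and the other group $4.
options = {
    "A": {"you": 2, "them": 1},
    "B": {"you": 3, "them": 4},
}

# The "rational" rule: maximize your own absolute payoff -> picks B.
max_own = max(options, key=lambda o: options[o]["you"])

# The rule the bias points toward: maximize the gap over the other group -> picks A.
max_gap = max(options, key=lambda o: options[o]["you"] - options[o]["them"])

print(max_own, max_gap)  # prints: B A
```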
Well, in fact, there's no difference between these groups. The group assignment is completely random. So any such distinction that you had made in your own mind is meaningless, but maybe you didn't know that. So we can rerun this whole experiment and get rid of the silly cover story. We're going to flip a coin: you're in group B. B. A. A. B. B. B. B. Look, guys, it's random. There is nothing differentiating your group from the other group. Guess what? You get the same result. I think it does get a little bit weaker, but not much. So just the act of being in a group causes you to be inclined to favor that group.

You are also inclined to think that group membership is diagnostic. Let me describe a different experiment, actually, as the easiest way to describe this. Suppose I put up a bunch of dots, and we're above the subitizing range here, obviously. OK, quickly: how many dots are there? I don't know. You could guess. You might get lucky and be right on, but typically you'd be either above or below. So you're going to do this for a while, guessing how many dots there are, and we are going to declare that you are an overestimator or an underestimator. That's the group assignment in this case.

Now, the interesting thing in this experiment is that you've done it with another person. The possibilities are: it could be two guys, both of them overestimators. It could be two guys, one of them an overestimator and one of them an underestimator. Similarly for two females, right? They could both be overestimators, or one of each, or both underestimators. Then there are the mixed pairs: it could be a male and a female who are both labeled as overestimators, or both labeled as underestimators. And the critical condition is a male overestimator with a female underestimator, or the other way around. The important point there is that the male is identified as one, and the female is identified as the other.
So you've got people in all these various groups. I don't know; somebody can figure out how many groups there must be for the full design here. You've got people in all those groups, and now, at the end of the whole experiment, you ask a question: do you think there's a systematic sex difference between men and women on this task? And what do you think the sex difference is, if there is one?

In all the same-sex groups, on average, the answer is: I don't think there's a difference. Some people say, I think women are better; some people think men are overestimators, or whatever. But there's no systematic bias there; nothing systematic happens in those conditions. Something systematic does happen in the critical condition, where the male is labeled one way and the female the other. There, you would declare that males as a group are overestimators and females as a group are underestimators. What's the evidence for that? One of each. Clearly not meaningful. There's no statistical reliability that you could glean from this. It could just as well be that brown-haired people and blond-haired people or something are different. But you know that there's a group difference here, because males and females are definitely different groups. And as soon as you've got evidence that there's a difference across those groups, you are willing to start making the assumption that the difference applies to the groups as a whole.

You see how that works? More or less? Somebody's nodding their head. That's encouraging. OK. So that's factor one: you're inclined to see your group as number one. And this is a point we'll come back to, which is that you're inclined to see groups as having properties that pertain to that group, and you'll jump to that quickly.

Now, the tendency to think that group identification tells you something about the properties of the group is known as stereotyping.
If I know that you are part of this group, I believe I know something about you. Period.

When we talk about stereotyping, we tend to think of it in negative terms. That is a fairly self-evident consequence of factor one mixed with factor two. If you are inclined to think that your group is number one, and you're inclined to think that group identity tells you something, it follows that being a member of another group means you're in a group that is, at best, number two. It ain't number one; that's my group. So you, in this other group that I've identified, are in some lesser group on this scale. That is fairly obvious.

What's a little less obvious, and is at least worth mentioning, is that stereotypes are not just descriptions of things that are common to that population. Here's the silly example. Let us consider the Asian woman stereotype. Is bipedality part of the Asian woman stereotype? No. Nobody sits around and says, all those Asian women have two feet. That's stupid. Nobody's going to say anything like that, because everybody's got two feet. What's important is that stereotypes are, in a sense, difference scores. Not necessarily accurate ones, you understand; they can be completely bogus. But what defines a stereotype is what somebody thinks differentiates one group from the population as a whole.

So I put some data down here from one big study of stereotypes. Please note that these are not facts. I mean, they are facts in the sense that they are data, but they're not true facts about, in this case, the German population. What they are is what this particular group of subjects reported believing about different groups, in this case the Germans. I excerpted it from a huge study. The attributes are: efficient, extremely nationalistic, scientific-minded, and pleasure-loving.
You will note that the largest single value for the German population in this collection of data points is pleasure-loving. But that's not part of what would be considered the stereotype here, because it's not higher than for the population as a whole. This particular group of subjects asserted that 82% of people in the world would be pleasure-loving, and a mere 72% of Germans would be. Therefore, pleasure-lovingness would not be considered part of the stereotype, because it's not something that distinguishes this view of Germans from the view of people as a whole. And so: efficient, yes. Extremely nationalistic, yes. And, interestingly, something like scientific-minded would be considered part of the stereotype even though it's not held to be true of a majority. This perception doesn't say that a majority of Germans are scientifically minded, but that more Germans are scientifically minded, according to this view, than the population as a whole. Again, I have no idea what the true data would be for the German population; I don't know if they think about science at all. But the perception is that the stereotype would include efficient, nationalistic, and scientific-minded, and not pleasure-loving, because of this issue of a differential relationship to the perceived baseline, to the population as a whole.
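As a concrete illustration of that difference-score idea, here is a minimal Python sketch. Only the pleasure-loving figures (72% for the Germans versus an 82% baseline) come from the lecture; every other number is an invented placeholder, there purely to show the computation.

```python
# Sketch of "stereotype as difference score": a trait counts toward the stereotype
# only if it is judged MORE common in the group than in the population baseline.
# Only the pleasure-loving numbers come from the lecture; the rest are invented.
judged_pct_of_group = {
    "efficient": 63,                # invented
    "extremely nationalistic": 56,  # invented
    "scientific-minded": 40,        # invented (below 50%, yet still stereotypic)
    "pleasure-loving": 72,          # from the lecture
}
judged_pct_of_everyone = {
    "efficient": 30,                # invented
    "extremely nationalistic": 18,  # invented
    "scientific-minded": 25,        # invented
    "pleasure-loving": 82,          # from the lecture
}

stereotype = [trait for trait in judged_pct_of_group
              if judged_pct_of_group[trait] > judged_pct_of_everyone[trait]]
print(stereotype)  # ['efficient', 'extremely nationalistic', 'scientific-minded']
```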
One of the factors contributing to the power of stereotyping is what's known as the out-group homogeneity effect. The in-group is you; you're in some group. The out-group is other people. The out-group homogeneity effect is the tendency to think that all of them are kind of alike. You don't think that about your own group in the same way. Take, from the last lecture, the nerdy high school freshman group. If that was my in-group, I would know that they aren't all alike, because I'm one of them, and I know that I'm different from all those other nerdy high school freshmen. But the jocks, man, they're all alike. It's an ignorance effect.

Now, how does this play into bias? How does this ignorance factor play into not just thinking that all those people are alike, but thinking less of those people? If you ask where you get your information about groups that you don't interact with much, you get it from the news. And what gets you onto the news? The fact that you're a good student and you love your mother? No. But if you decide to go off and commit an armed robbery, or something like that, that might get you on the news. And if you are a member of some distinctive group that's not my group, I'm going to say, hey, look, I've got a data point (we can keep picking on Asian women), I've got a data point about Asian women: that one committed an armed robbery. I know about Asian women now. They all commit armed robbery. Well, you don't do anything quite that bald and stupid, but that's the sense in which you are willing to color the entire group on the basis of whatever information you have about one member of it. That's another version of this effect, and it's likely to lead to negative assessments of the out-group, because the information you get about groups that you don't interact with is skewed towards the negative. The stuff that's going to make the news about a group is typically going to be the negative stuff.

So where are the strongest stereotypes in the American population? Well, actually, this is a somewhat old study at this point, but you can use various questionnaire assessments to measure how strongly one population holds stereotypic views of another population. The most heavily stereotyped views held by an American population (this is now about 15 years ago) were held of Turks and Arabs, and to a lesser degree of the Japanese: groups that were not heavily represented in the general American mix.
There were less heavy stereotypes for groups with large immigrant populations in this country, because you were more likely to know some people in that group, and that makes the stereotypes less firm, less strong.

One of the reasons that these stereotypes matter is that, along with being willing to build them quickly and easily, we are also inclined to think that the attributes we put on other groups are causal, at least in others. This is what's known as the fundamental attribution error. Let me explain a little. I sent a note to a couple of my social psych colleagues yesterday, as I was thinking about this lecture, asking who it was who named the fundamental attribution error, because that's a great thing to be able to do, to be able to call what you work on fundamental. And it turns out to be Nisbett, N-I-S-B-E-T-T, from the University of Michigan, if you want to track that down. But, in any case, let me explain what this is.

There are two broad ways of thinking about personality. We all have some notion that we've got a personality, and thinking about the attributes that make up that personality can be divided into two broad categories that map onto the usual nature/nurture kind of arguments in the field. There are trait theories, which are typically on the more nature side of it: there are fundamental attributes of personality, maybe coming from a genetic origin, and you are who you are because you have those traits. The alternative, from the more nurture side of things, the environmental side of it, is a situationalist account that says you are who you are because of where you are. On a trait theory, you are here now because you were born smart and hardworking and studious, or at any rate you have hardworking, studious attributes that are consistent over time. The situationalist account says you're sitting here right now emitting student behavior because you're in a student kind of environment. If we put you on a farm, you would not be sitting there in the middle of the field with a notebook taking notes about the cow.
That's not what you'd be doing. In that situation, you'd be doing farm kinds of stuff.

Like all such debates in the field, the truth is going to lie somewhere in between. There are going to be bits of both, and you're not going to get any mileage out of arguing strictly one or strictly the other. But what you think about the balance here is important for policy purposes, for instance. Why did this guy commit a crime? We know he committed a crime, because we just convicted him of committing this crime. Why did he commit it? Is it because he's a criminal sort of person, a fundamentally dishonest, nasty kind of person? Or is it because he was in a situation that promoted criminal behavior? Why does that matter? Well, you may send him off to prison in both cases. On the trait view, you're likely to think of prison as a place where you put bad people to keep them out of the way for a length of time that's appropriate to whatever bad thing they did; basically, as a punishment. The situational view is the personality theory that would lead you to call your prison a correctional institution, because you think that you could correct this person. If you think he's fundamentally a bad person, the trick is to make sure he can't do that anymore. If you think it's because of a situation that somehow forced or pushed him into criminal behavior, you want to fix that. Modern prison philosophy is neither all the way on one side nor all the way on the other, but the balance is really a personality theory question.

The fundamental attribution error is the tendency to hold to a more trait-theoretic position when you're talking about other people than when you're talking about yourself. Why is it an error? Well, it logically can't be the case that you are largely the product of the situation while they are the product of their invariant traits. Over the population as a whole, that isn't going to hold up.
Why did this guy rob the bank? He robbed the bank because he's a criminal. Why did I rob the bank? I robbed the bank because I was hungry, and the door was unlocked, and I didn't really rob the bank, I just kind of picked up the money that was lying on the floor, and anyway, it was the other guy. You're much more likely to give a situational account of your own behavior.

Why did you get a bad grade on the midterm? Suppose you got a bad grade on the midterm. Why did you get it? Well, the story was really lame and it distracted me, and I didn't get enough sleep, and the course wasn't a big priority for me. That's why I got a bad grade. Your TA says (well, your TA is a good person and doesn't say this), but your TA looks at the exam and says, why did they get a bad grade on the exam? They're stupid! That would be the fundamental attribution error. And why did your TA get a bad grade? Oh, I didn't get enough sleep, and stuff like that. We're more inclined to give situationalist accounts of our own behavior and more inclined to give trait accounts of other people's behavior.

All right, so let's see where this has gotten us. We're inclined to put people into groups. We're inclined to assign attributes to those groups. We're likely to assign more negative attributes to groups than we ought to, because our information about groups that we don't know is skewed in that direction. And we're likely to attribute behavior to what we now perceive, correctly or incorrectly, as the traits of the group. So you can see how you're going to end up with a story that is a pretty negative account of some out-group or other.

Oh, by the way, there's a very interesting wrinkle on the fundamental attribution error, or at least there certainly used to be. I do not know whether this is still the case in the population as a whole, and I certainly don't know, though I would be very interested to know, whether it's true at MIT.
Let's try this as an experiment and see how your intuition goes. We can do the trait versus situational thing with a math test. You're interpreting your own score on a math test, and that score can be good or bad, as you may know. There are explanations for why the score was good: a trait theory explanation would be, I'm a genius; a situationalist explanation would be, I'm lucky. And if the score is bad, you can have an assessment that says I'm dumb, or an assessment that says the test was unfair.

The interesting bit is that, for each of these four cells, one gender is more likely than the other to give that answer; each cell can be associated with a gender. So: I got a good grade on the test, I'm a genius. Who are we talking about?

AUDIENCE: Men.

PROFESSOR: All right. And it follows that the "I'm lucky" cell must be the female cell. I got a bad grade on the test, I'm dumb. Who's that?

AUDIENCE: Female.

PROFESSOR: And that is the historical finding. These are studies that came out in the early days of the women's movement, and I don't know whether or not it still pertains. But the disturbing finding at the time was that males were inclined to give trait-theoretic answers for the good stuff: I'm good, I'm bright, I'm gorgeous. And females were likely to say, I'm lucky, and it's all makeup, or something like that. On the other side, the bad side, the females were likely to give the trait-theoretic answers: I'm dumb and ugly and depressed and it's terrible. And the guys were likely to say, I'm brilliant, I'm gorgeous, et cetera, and the test was really kind of unfair and my teacher hated me and stuff.

It would be interesting to know whether that is still true. We might as well take a poll.
How many people think that if we actually collected data, we'd find that something like that was still true? How many think that we would find it has gone away? I have no idea if there's new data on that. The basic point is that, with that possible modulation, we tend to give situational explanations of our own behavior and trait explanations of other people's behavior.

Now let's take a look at this last factor, which I'm calling the role of ignorance in person perception, and see how that can lead to what looks like a biased outcome, perhaps even if you didn't have these other factors. This ignorance factor is then going to interact with the other factors to produce biased outcomes, and it's quite easy to come by.

So let's do a version of the classic physics joke about assuming the horse is a sphere; we're going to oversimplify the situation. The issue here is: who are you going to be friends with? Well, first of all, we have to go back to the good, earnest high school discussion about making friends with people. Does it matter if they're wearing the latest designer whatever? And the answer, of course, is no, because you shouldn't judge a book by its cover. Good. Nice cliche. And it is, of course, true.

In the first assume-the-horse-is-a-sphere oversimplification of the problem, let us assume that the set of people with whom you might be friends in the world is, let's just make it, all MIT undergraduates. We know we don't want to judge books by covers, so what you're going to do is set up in-depth interviews with everybody and decide who's going to be your friend on the basis of that. No, that's not going to work. Well, all right, the other alternative is: I don't want to judge books by their covers, therefore I won't talk to anybody ever again. That's not going to work either.
So it is self-evident (again, various bits of this are self-evident) that you've got to make snap decisions on the basis of imperfect information. That doesn't mean you have to make your decisions on the basis of whether or not they're wearing designer whatevers, of course. But you're necessarily going to have to make a first cut through the population on the basis of essentially superficial information. Well, what's that going to do?

Let us assume that in the world, in an act of massive further oversimplification, there are bad people and there are good people, and your job is to divide the world into those two categories. So you're going to make an assessment; you're going to perform an act of person perception and divide the world into bad people and good people. We've got a nice, simple two-by-two design here: ideally, you want everybody to land in the two cells where your judgment matches what they really are. Even if you could do in-depth interviews with everybody, it's not clear you'd never make a mistake. But clearly, if you're just going to be basing your decisions on relatively superficial information, there are going to be errors.

I labeled these Type 1 and Type 2, which is actually jargon from signal detection land, but don't worry about that. It just gives us, in this case, a chance to ask the question: which of these types of errors is worse? If you had a choice about which error to make, given the choice between, let's be clear about this, you've got a good person and you declare that good person to be bad (Type 1), or you've got a bad person and you declare him to be good (Type 2), which of those errors would you rather make? How many people vote for Type 1? How many people vote for Type 2? OK. So, a Type 2 person: why do you prefer the Type 2 error? Any Type 2 person?

AUDIENCE: [INAUDIBLE]

PROFESSOR: You're missing out on good people. That's the nice person answer.
You don't want to inadvertently tar some nice, good person with the label of being bad. How about a Type 1 person?

AUDIENCE: [INAUDIBLE]

PROFESSOR: There are a lot of people who can be your friends; you don't need all of them. And you want to make sure you catch those bad people. How come? Anybody else?

AUDIENCE: It could be harmful.

PROFESSOR: Yeah, it could be dangerous. If we dichotomize this into good people and really bad, nasty, dangerous people, then that intuition becomes a little clearer: whatever else you do, you want to keep those people away.

This is applied signal detection theory. Usually you do signal detection theory in visual perception land or something, but this is what the next page of the handout has on it. Here's where this is coming from. Let us suppose, again for the sake of vast oversimplification, that on a scale of goodness there are only two types of people in the world: good people and bad people. You want to pick the good people and reject the bad people. But the difficulty is that you can't see this directly, because your information is lousy. The effect of your information being lousy is that what you see is a spread-out distribution of goodness and a spread-out distribution of badness, something like that. By the way, if you were doing this in vision land, these would be two lights: can you tell the difference between a dim light and a bright light, or something like that? By the time the signal goes through your nervous system, rather than the bright light always looking exactly one way and the dim light always looking exactly another, sometimes the bright light looks a little dim and sometimes the dim light looks a little bright. And how do you decide which one you've seen?

So you've got a person in front of you. How can you decide whether they're good or bad? Well, the best you can do is put a criterion in there somewhere. So let's just divide it.
If I do that, that's OK. That means I'm going to declare everybody on one side of the criterion to be good, and everybody on the other side to be bad. And, OK, so I'm declaring all of the people who are in fact good, and who fall on the good side, to be good. The difficulty, the sad thing, is the good people whom I'm declaring to be bad; those are the Type 1 errors. These are good people who I declared to be bad. That's too bad. Over on the other side are all the bad people who I declared to be bad; that's exactly what I wanted to do. But then there are the Type 2 errors: bad people who beat my criterion level, and I said they're good.

Now, you should be able to figure out, looking at a picture like this, that there's no way to eliminate error. If this is the stimulus situation I have to deal with, there's no way to eliminate error; all I can do is apportion it. So if I decide that the Type 2 errors are the dangerous errors that I need to avoid, then I'm going to move my criterion over. This is the second picture on the handout. If I move my criterion over so that I reduce my Type 2 errors to just a few, let's say, the result is that I've massively increased my Type 1 errors. I'm now declaring all these lovely people to be people I don't want to make friends with. It's sad, but that's the way it goes. Because, as the gentleman back there said, there are a lot of people here, so I've got plenty of people to be friends with, and these folks will just have to deal with the fact that they're not my friend. That's just the way it is. But I'm not letting any of these mean, nasty, rotten people in, except for the few who still beat the criterion. Most of them, I'm going to avoid.

OK. Now look what happens when you deal with an out-group, a group other than your own. The argument is that part of what makes an out-group the out-group is the fact that you know less about them.
The way to express that in these signal detection terms is as an increase in the noise, an increase in the spread of the distributions. What that ends up looking like is: you still have the good people and the bad people, but now your perception is less accurate. OK, that'll do. The distributions now just overlap more, because we don't know as much about these people; you're not as good at telling them apart.

You want a trivial example of this? Let's take wolves. There's an out-group for you: the ones with the big, sharp, grandma-what-big-teeth-you've-got kind of teeth. Maybe there's a good wolf out there somewhere, a nice wolf, the kind of wolf that we're supposed to have adopted back in antiquity to make into dogs eventually. But you meet a wolf on the street, and you don't know much about him. Where should you draw your threshold before bringing him home to play with your six-year-old? You're going to draw your threshold way out here somewhere, right? I don't care if I reject the one nice wolf; it's just really risky to bring wolves home. And that's because you're just really, really ignorant about wolves; you don't know much about them. Maybe if you knew wolves better, you'd know who the nice ones were.

All right. With human populations it's obviously much less dramatic than that. But you don't know much about these other people, so the distributions, theoretically, overlap more. You still want to make only very few errors where you let bad people in next to you, so that's going to cause you to move that threshold still further over. Not because you don't like these people; understand that there's no explicit bias going on here. You're just being cautious in this story. You can get explicit bias out of the first three factors, but this factor has no explicit bias in it at all. Just ignorance.

So now you say, oh good, I'm only letting in this tiny percentage of really bad people. And now, obviously, there's a little problem here with your Type 1 errors, where you reject good people: you have rejected almost the entire population of this other group. You know that you're not biased in your heart of hearts, because you can still say, as it says on the handout about this little tail of the distribution, some of my best friends are X, whatever that out-group is. Some of my best friends are white, black, Christian, Jewish, whatever the other group is that you're dealing with. This signal detection story will get you there, with some people in the group who are fine because they beat your threshold, and the vast bulk of the rest who disappear, because you're applying the same caution to an out-group that you were applying to the group that you knew something about. So that's how ignorance can end up being a factor in producing what looks like biased behavior. If you compare these two cases, you'd have to say I'm biased against this group: out of 100 of these people, I'm only letting five of them be my friend, while out of 100 of those people, I'm letting 60 of them be my friend. That's a biased outcome from no explicit bias.
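Here is a minimal Python sketch of that signal-detection argument. The Gaussian shape of the distributions, the means and standard deviations, and the 2% target for Type 2 errors are all illustrative assumptions; only the qualitative logic (hold the rate of letting bad people in fixed, and watch the rejection of good people grow as the noise grows) is from the lecture.

```python
# Sketch of the signal-detection story: perceived "goodness" is noisy; you place
# your criterion so that only a small fraction of truly bad people slip past it,
# and the Type 1 error rate (good people rejected) falls out as a consequence.
# Means, standard deviations, and the 2% Type 2 target are illustrative assumptions.
from scipy.stats import norm

def rejected_good_fraction(noise_sd, type2_target=0.02, mu_bad=-1.0, mu_good=1.0):
    # Criterion placed so that only `type2_target` of the bad distribution exceeds it.
    criterion = norm.ppf(1 - type2_target, loc=mu_bad, scale=noise_sd)
    # Fraction of the good distribution falling below the criterion: Type 1 errors.
    return norm.cdf(criterion, loc=mu_good, scale=noise_sd)

print(rejected_good_fraction(noise_sd=1.0))  # in-group: a fair number of good people rejected (~0.5 here)
print(rejected_good_fraction(noise_sd=2.5))  # out-group, noisier: almost all good people rejected (~0.9 here)
```

With these made-up numbers, holding the Type 2 rate at 2% rejects roughly half of the good in-group members but close to ninety percent of the good out-group members, which is the same shape as the five-versus-sixty contrast described above.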
So now you say, oh good, I'm only letting in this small percentage of really bad people. But now, obviously, there's a little problem here: your Type 1 errors, where you reject good people. You have now rejected almost the entire population of this other group. You know that you're not biased in your heart of hearts, because you can still say, as it says on the handout about this little tail of the distribution, "some of my best friends are X," whatever that out group is. Some of my best friends are white, black, Christian, Jewish, whatever the out group is that you're dealing with. This signal detection story will get you there, with some people in the group who are fine, because they beat your threshold, and the vast bulk of the rest, who disappear because you're applying the same caution to an out group that you were applying to the group you knew something about.

So that's how ignorance can end up being a factor in producing what looks like biased behavior. If you compare these two, you'd have to say I'm biased against this group. Because out of 100 of these people, I'm only letting five of them be my friends; out of 100 of these people, I'm letting 60 of them be my friends. That's a biased outcome from no explicit bias.

Now, this question of explicit bias versus either no bias or implicit bias is an interesting one. You don't necessarily have a clear idea of the biases that you may have. One of the more interesting and more disturbing bits -- there's a thing called the Implicit Association Test, the IAT. If you want to try this out on yourself, go to www.prejudice.com, I think is the right site. That is one site. But if that fails, go and find your way to the website of Mahzarin Banaji at Harvard. She is one of the leading practitioners of this, and her website will link you to a place where you can try this out on yourself. As the website will tell you, be forewarned: you may find the results of this experiment disturbing. But it's well worth trying out on yourself.
Now, what is this experiment about? This experiment is, in effect, a version of a Stroop interference test. The classic Stroop interference experiment is an experiment where you see a collection of words, and your job, whatever the word says, is just to tell me what color the ink is that the word is written in. So if I write "cat" in red ink, you say red. And if I write "dog" in blue ink, you say dog -- no, you say blue; that would be a different kind of interference. The problem is that if I write "red" in blue ink, some people will simply make the mistake of saying red, and everybody, on average, will be substantially slowed down, because of an inability to suppress that response. And if I write "red" in red ink, they'll be speeded up. So if the two sources conflict with each other, you're slowed. If the two sources agree with each other, you're speeded.

OK. So here's what you do in an IAT experiment. You tell people, I'm going to show you some words. If they're good words, nice words, you push this button. And if they're nasty words, you push this button. OK, no problem. So: nice, boink. Evil, boink. Pain, boink. And so on. Not very tough. OK, second task. I'm going to show you some pictures of people. If it's an old person, I want you to push one button; if it's a young person, I want you to push another button. So we put me up there, boink. Put you up there, boink. Now what we do is we do mixed blocks, where you're going to see words and pictures together. And I'm going to tell you, OK, now if you see a nice word or an old face, push this button; if you see a nasty word or a young face, push this button. It's just the two tasks on top of each other.
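As a purely hypothetical illustration of how data from a task like this might be scored: the trial log, block labels, and reaction times below are all invented, and the published IAT uses a more elaborate scoring procedure than this simple mean difference. The basic logic is just to compare average reaction times when "nice" shares a button with one category versus the other.

from statistics import mean

# Invented reaction times, in milliseconds, from two hypothetical mixed blocks.
trials = [
    {"block": "nice+young / nasty+old", "rt_ms": 612},
    {"block": "nice+young / nasty+old", "rt_ms": 655},
    {"block": "nice+young / nasty+old", "rt_ms": 598},
    {"block": "nice+old / nasty+young", "rt_ms": 744},
    {"block": "nice+old / nasty+young", "rt_ms": 731},
    {"block": "nice+old / nasty+young", "rt_ms": 790},
]

def mean_rt(block_name):
    # average reaction time for one response mapping
    return mean(t["rt_ms"] for t in trials if t["block"] == block_name)

rt_nice_young = mean_rt("nice+young / nasty+old")
rt_nice_old = mean_rt("nice+old / nasty+young")
print(f"nice+young block: {rt_nice_young:.0f} ms")
print(f"nice+old block:   {rt_nice_old:.0f} ms")
# A positive difference means responses were slower when "nice" shared a button with "old".
print(f"difference:       {rt_nice_old - rt_nice_young:.0f} ms")

In this sketch a positive difference would just mean that responses were slower when nice and old were mapped to the same button, which is the pattern the lecture describes next.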
Whichever way we do this, you'll be slower, because now you have to keep two rules in mind at once. But what's striking is that if you do nice-and-old, you're significantly slower than if you do nice-and-young. It's as if mapping nice and old onto the same response causes a conflict that doesn't work for you, and mapping nice and young onto the same response does. As if you've got a bias in favor of young over old.

Why is this an implicit attitude test? Because if we give you an explicit attitude test that asks what your attitude is towards young and old people, you may perfectly well report, I love old people, I love young people, I love everybody. But you'll still come up with this result. I deliberately did this example because it doesn't tend to carry an awful lot of emotional loading for people. But the reason this may be a disturbing test for people to take -- you should still go off and do it -- is that if I pair nice with black, with African American pictures, and nasty with white, then regardless of your explicit report of what you consider your bias to be, the white population in this country will, on average, have slower reaction times to the nice-and-black pairing than to the pairing of nice and white. Actually, one of the more interesting and depressing findings is that this doesn't even completely reverse with an African American population of subjects. In the African American population, the last time I checked on the data, the pairings were roughly equal. So the African American population presumably has some ethnocentric bias in its own favor, an implicit bias in its own favor, but that's counterbalanced by some incorporation of the overall bias against them, and so they come out as roughly equal.

The debate in the literature about this line of work is: is this tapping implicit attitudes, the notion that, regardless of what we think, we're all racists or something like that? Or does it just say that we have somehow incorporated -- that we know, at some level -- the biases of the culture as a whole, even though we, ourselves, may not be biased?
It is an interesting question, beyond the scope of what I can talk about today, whether there is a serious difference between those two. But the disturbing finding is -- and you can do this -- that black, white, old, young are by no means the limit on this game. Once you've got the methodology, you can do it on anything. So, shortly after September 11, they started doing these tests -- and you don't need to do it with faces either; if you want to do Arab versus non-Arab, you can just do it with names. So you do Abdul and Mohammed and so on, and then you do Chris and Jane. And you find that in the American population, at the moment, nice-and-Mohammed is slower than nice-and-Robert. It's not surprising, but it is, nevertheless, disturbing how easy it is to show these effects. They're robust. They show up across populations, and they show up fairly independent of what people report. It doesn't matter if you explicitly have the bias -- and let's assume that people are reporting honestly. You can show something that looks like a bias with a test of this sort. Anyway, give it a try. It's interesting and disturbing.

In the interesting and disturbing department, I will continue in a minute or two talking about the link between attitudes and actual behavior. So this is a good place for a short break.

So, look. Bias. We can presumably agree that bias isn't a good thing. But if it stays in the realm of private opinion, or if it stays in the realm of implicit opinion that you don't even know you have yourself, it's not exactly front page news. We also all know from reading the front page, though, that there are regrettably frequent occasions where one group is willing to slaughter another group based on very little more, if anything more, than group identity. So an absolutely critical question is: how can people be moved from attitude to action?
And the disturbing aspect of what we know from experimental psych about this is that it is surprisingly easy to have your behavior controlled by outside forces. The experimental work, of course, cannot get people to go out and slaughter each other; nothing like that would be even faintly moral. But, rather like the Klee and Kandinsky experiments, you can do experiments that show it's surprisingly easy to manipulate the situation in ways that change behavior, in ways that look at least a little bit disturbing.

One of the classics, back in the '50s, that gets this literature going, was done by Asch, at Columbia at the time. I think the picture is still in the book -- a marvelous collection of male Columbia nerds from the '50s. Anybody read the chapter yet, and happen to know if it's there?

Here's the basic experiment. You come into this experiment, and you're doing an experiment on visual perception. Asch shows you a card with three lines on it. Your job is to say which line is longer. Now, the odd thing about this -- well, you're not an experimentalist, so why should you care? -- but it's a little odd, as an experiment, to be doing this in a group. But it turns out you're doing this in a group. So Asch holds up the card, and you say B, and you say B, and you say B, and everybody says B. Next card. I'm not going to bother changing my cards, but you get the basic idea. On the critical trial, up comes this card. He says C. He says C. He says C. She says C. C. C. Now, we're up to -- I stuck with her because she's got glasses, and the nerdy Columbia guy has glasses on too. What does she do? Well, why did all these guys say C? Are they some kind of morons? They're all confederates of the experimenter. The only real subject is this person. And the question is, does she say C? The answer is that about a third of the time in the original Asch experiment, the answer's yes, she says C.
It is completely clear that even when she doesn't say C, she's uncomfortable. This is an experiment on peer pressure. And it's perfectly clear that when everybody else is saying C, she's busy taking off her glasses and checking them, and stuff like that, to see what's going on here. There's something wrong.

What do you think manipulates -- the standard result is that you get about a third of the people complying with the pressure. What reduces that compliance?

AUDIENCE: How [INAUDIBLE]

PROFESSOR: Yeah, sure. Presumably it's hard to get the result. But OK, if we keep the physical stimuli the same. Yes.

AUDIENCE: [INAUDIBLE]

PROFESSOR: If somebody picks A, and it just looks noisy. I don't know if they ever did that particular manipulation. That's an interesting question. That might change things.

AUDIENCE: If they have six or seven people, [INAUDIBLE]

PROFESSOR: It doesn't take much support. You probably know this from arguments with groups of people or something. It's hard to be the first person to voice the minority view. It's much easier to be the second person. I think I actually have the data for that. Yeah. One supporter. One supporter in the group drops compliance from one third to one twelfth. So, a big drop in the amount of compliance. The more people who say C, the more likely you are to comply. The smaller the group, the less likely you are to comply. But the point is that even in a matter as seemingly straightforward as which line is longer, you can feel that pressure from others.

The most famous experiment in this canon is an experiment done by Stanley Milgram. And in that experiment, here's the setup. You come into the lab for a study on learning -- the effects of punishment on learning. And there are two of you. One of you is going to be the learner and one of you is going to be the teacher. OK.
And we're going to decide this randomly. Flip a coin. You're going to be the teacher today. Now, in fact, this is not random. The subject is always the teacher. The learner is always a stooge of the experimenter, and here's what happens. You're told you're going to do some sort of a task, and your job, as the teacher, is to give the learner a shock every time they make a mistake. This was done back in the '60s with this great big hunk of electrical equipment, with a gazillion switches on it, running from 15 volts to, I think, 450 volts, in 15 volt increments, with instructive little labels like "mild," and then up here somewhere is "severe," and by the time you get up here, it's labeled something like "XXX." The rule is that every time the learner makes a mistake, you increase the voltage. Now, we will give you a 45 volt shock. You, the teacher, get a 45 volt shock, just to see what it's like. And a 45 volt shock from this apparatus is mildly unpleasant. It's nothing you'd want to sign up for. It's not going to kill you, though the suggestion is that this end of the scale might. And that's the rule of the game.

Now, before doing the experiment, Milgram went and asked everybody under the sun what the result would be. Well, the answer was that everybody under the sun will bail out pretty early here. Nobody's going to give them very massive shocks. He asked theologians, he asked psychologists, he asked people off the street, and everybody agreed this is not going to lead to much in the way of shocks. And this made the result of the first experiment a little surprising. Everybody went all the way through to 450 volts. The entire population of subjects in the first study went through to 450 volts. Now, were they thrilled about this? No. It was absolutely clear that the subjects were very uncomfortable about this, and that they questioned whether they should do this.
And Milgram had an absolutely stereotyped response that he'd prepared in advance: please continue, the experiment must go on. And that was it. You were free to leave, though he didn't explain this to you in great detail. But if you said, should I do this? He said, please continue, the experiment must go on. And people did.

Now, he was a little surprised. In the original version, the alleged learner had been taken out and put in a different room. And Milgram figured, well, look, maybe these people just didn't believe the setup story here. So they moved the alleged learner to a position where he was visible and making noises about this. He's vigorously protesting as the voltage gets larger. At some point, up here, he says he's not going to respond anymore. The experiment's rigged so that he keeps making mistakes as a result -- no response is considered an error. And Milgram is there saying, please continue, the experiment must go on. Oh, and the learner also makes useful comments like, I have a heart condition, and stuff like that. It's pretty vivid stuff.

So, what happens in that case? Well, great: no longer do 100% of the subjects go all the way through to the end. Only two thirds of them go all the way through -- with a learner who has stopped responding, and who, for all you know, you're perhaps killing. This is pretty disturbing stuff. It was very disturbing to the subjects. Actually, this is one of the experiments that produced the need for informed consent in experimental psychology. You didn't just go and hijack people off the street and say, you have to be in my experiment; these people were volunteers. But there was no sort of consent process the way we would have today. And the level of distress produced in these subjects was part of what drove the field to require informed consent.
Why was the experiment done at all? This is an experiment done in the '60s, less than a generation after World War II. And a question that had obsessed social psychology since World War II was: how could the Nazi atrocities have happened? Who were the people who went and killed millions of other people? Not the soldiers, but the people who killed six million Jews, and, I don't remember, a million and a half Gypsies, and some large number of gays, and so on. Who were these people?

There was a theory out there, encapsulated in a book called The Authoritarian Personality, which, among other things, fed into stereotypes about the Germans: the idea that there was a certain type of person who was willing -- the Nuremberg trials, the war crimes trials after the war, had produced, over and over again, the line, "I was just following orders." And the notion was, well, there are some people who are just good at that. They just follow orders. And the rest of us are not like that. Milgram suspected that was not the case. Milgram suspected that the answer was that, in the right situation, many people could be pushed into acts that they would objectively think were impermissible. This was his effort to get at that question.

If this sounds like current events to you: every social psychologist with half a credential was on the news after the Abu Ghraib prison scandals earlier in the year, because it just sounded so much like this sort of problem again. Many of the people who were charged responded that they were, if not following orders, then following the implicit instructions that they felt around them. You got endless articles in the papers saying, oh, you know, so-and-so was just like everybody else back home; I don't understand how he, I don't understand how she, could end up in these pictures doing things that any reasonable person would say are unacceptable in a military prison situation.
It's the same kind of question -- a different scale of magnitude, of course, from the Nazi atrocities, but it's the same question. How do people end up doing things like this?

Before attempting to answer, which I can see is going to run into my next lecture, let me tell you about a different experiment designed to get at the same question. This is an experiment where you think you're in a study, sort of a consumer relations kind of study, that you might run into at the shopping mall. They've set up the experiment at a shopping mall. And the cover story is this: we're doing some research on community values, because we're basically doing investigations for a legal case. Here's the situation. This guy was living with a woman to whom he's not married. His employer found out about it and fired him. This has gone to court, and the court case hinges on community standards. If community standards are that living in sin, out of wedlock, is a bad, bad thing, then it's OK to fire him; otherwise not. So we've got to find out what the community standards are. Let's all have a discussion.

So, I've told you the story. I've got the cameras rolling here. I, the experimenter, am going to step out, and you guys discuss. OK, you guys discuss. That's great. Now, I come back, and there's a group of ten of you or something discussing this. I come back in, and I say, that was great, thank you very much. I know what you believe, because I've been watching this. But I want you, you, and you, please, to argue from the point of view that the guy should be fired. I know it's not what you really believe, but just argue for that. OK. The experimenter comes back in again. OK, now I want you, you, and you to join that argument, arguing that the guy should be fired. Fine.
And then the penultimate step is: I want everybody to have a chance to look into the camera and say why the guy should be fired. I know you don't believe it, but just give me a little speech as if you believed it. OK. Cool. OK, last step. Here's this statement that says that I can use my videotape in any fashion I wish, including submitting it as evidence in court. Will you please sign? I'll be back in a minute; I've got to go rinse a few things out. But you guys decide.

The question is: do people sign? I mean, this is obviously, basically, asking you to perjure yourself, if that wasn't what you believed. Now, this was a huge experimental design originally. They were crossing the number of people with gender, and they originally had a plan for, well, I can't remember, a gazillion groups. They called off the experiment early because, like the Asch experiment and like the Milgram experiment, this experiment produced extremely strong feelings in its subjects. They had gotten a form of informed consent, which I can't go into now, but even with that, it felt unethical to them to keep going. So they had 33 groups. It was a busted design, but they had 33 groups. Of those 33 groups, how many did the Milgram thing, where everybody went all the way and signed? What do you think? The answer is one, I think. One group gave total obedience. Sixteen of the groups showed unanimous refusal to sign. Nine groups showed majority refusal to sign. So, in this experiment, compliance was much, much lower.

And the question that I'll take up next time, at the start of the next lecture, is: what's the difference between the Milgram experiment -- well, let me say one last thing about the Milgram experiment. This is not an isolated result. Milgram's at Yale.
It's not just an isolated experiment that happened in New Haven in 1960. It's been replicated all over the place. It doesn't matter whether it's in America or not. It doesn't matter what the age or sex of the subjects is. It replicates beautifully. So why does this experiment work, when the other one didn't?