All right. Just a couple of announcements real quick. Can you guys turn this down a little bit?

For tomorrow, you should all be ready to give your short presentations on your design projects in recitation. We posted a one-page guide describing what you should prepare for next time in class. These are short presentations; you shouldn't need to put too much time into getting them together, but please look at the web page if you're not sure what you're supposed to be doing in class.

Today what we're going to do is wrap up our discussion of security. And I want to start off with a fun little diversion: over the past couple of times that you guys were in class, we captured a log of all of the network traffic that was going on while people were in class on their laptops. And, for the most part, it turned out that you guys were pretty boring. Nobody was actually doing anything exciting, but there were a couple of interesting lessons.

In case you're curious what we actually did, we ran a tcpdump command, something along the lines of "sudo tcpdump -i en1". The sudo command just says run this command as though you were the superuser. And tcpdump dumps out all the traffic on a specific network interface; in this case, it is dumping out all the traffic on en1, which on this computer is the wireless network. So we were just dumping out all of the packets that were sent over en1.

What you see here is that the yellow part is the header of the packet, saying this is an IP packet and specifying who the source and destination of this packet are. And then there is some body, some sort of payload, associated with this packet. In this case, this first packet is some binary data; we cannot make any sense of it. But if you look at some of these other packets, for example, this one is an HTTP request packet, and what you see is that there is, in fact, English text.
The HTTP protocol uses English text to transmit information, and so this is the actual HTTP request going out; in this case, it was going out from this laptop. I just set this up to illustrate what tcpdump does. We ran a command like this while you guys were in class and captured a trace of what you were doing.

Then we ran a little Perl script that went through this trace of packets and pulled out all the web pages that you guys had been looking at, and I have pictures of some of those pages. A lot of it, as I said, is pretty boring. There is somebody who was looking to buy some kind of USB drive, so they searched Google for "Micro Center cruiser USB high-speed". We saw the page coming back from Google with the search results. And, in fact, if you poke around some more, you see this person actually found the Micro Center website, went to Micro Center, and apparently was trying to purchase this thing.

There were also a couple of people looking for movies, people who were looking for a movie to see. In fact, we saw this on both days. It wasn't just one day that somebody was looking for movies; somebody keeps coming to class and looking for movies.

The next one, though, is much more interesting. There are a couple of web pages where somebody is looking at a tool called Gaim-Encryption. Gaim is an open source AOL Instant Messenger client, and Gaim-Encryption is a version of the open source Gaim client that encrypts data that gets transmitted over the network. The way it works is that when you install this tool it generates a public/private key pair for you. Then, when you have a chat session with somebody else who is also using Gaim-Encryption, it sends that person your public key, and it uses public-key encryption and signing to protect the information that gets transmitted over the network.
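(To make that concrete, here is a toy sketch in Python of that kind of exchange: key pairs generated at install time, public keys swapped, messages encrypted to the recipient and signed by the sender. This assumes the pyca/cryptography package, and it illustrates the idea only; it is not Gaim-Encryption's actual protocol.)

    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    def keypair():
        # generated once, when the tool is installed
        return rsa.generate_private_key(public_exponent=65537, key_size=2048)

    alice, bob = keypair(), keypair()

    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)

    # Alice encrypts to Bob's public key and signs with her private key.
    msg = b"meet after recitation?"
    ciphertext = bob.public_key().encrypt(msg, oaep)
    signature = alice.sign(ciphertext, pss, hashes.SHA256())

    # Bob checks the signature, then decrypts. An eavesdropper running
    # tcpdump sees only the unintelligible ciphertext.
    alice.public_key().verify(signature, ciphertext, pss, hashes.SHA256())
    print(bob.decrypt(ciphertext, oaep))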
So there was somebody who, during class, was looking at this tool. We see they go to the project page on SourceForge, where this tool is hosted, and they are clicking around looking at what the tool does and reading about it. And then you see that they download this tool. And, in fact, after that, at some point, there is a whole lot of AOL Instant Messenger traffic going out on the network. I am just surmising that this traffic actually belongs to the person who downloaded this tool. And you can see that, in fact, the tool appears to be working: this traffic is not intelligible. In other cases, the traffic actually goes across the network in plain text; Gaim normally runs in plain text, and you can see the entire transcript of the chat happening if you capture the packets.

There is a very interesting security lesson here. Sometime later in this trace we see this same person; call them person one, whose name is transmitted in plain text. Even when they are using the Gaim tool you don't see their messages in plain text, but you see that this person now is actually talking, having a conversation. And they say: oh, I downloaded this Gaim tool, now I can talk encrypted to other people. And yet there is this conversation where it is not encrypted, and this sort of dialogue goes on. They are talking about somebody who is in class, saying why doesn't this person go to class anymore? And they are having this little chat session back and forth about why this person isn't in class. And the dialogue goes on about their design project and how they're worried about it.

It is a nice example of a security problem, or a potential security problem. This person actually did something that was sort of nice: they came to security class, they are learning about security in class, they went and downloaded this tool and thought, OK, I've got this tool installed and now my conversations are protected.
And then, as soon as they try to talk to somebody who isn't also running this version of the tool, their conversations are no longer protected. This is a nice example of why security is actually a really hard thing to get right. This person may even have believed that the person they were talking to had this tool installed as well, but you don't know that it is not working until somebody points out that it is not working, until somebody goes in and is able to access the unprotected data. This is a nice example of why security is hard, and why providing protection is essentially a negative goal: you never really know that you are protected until somebody points out that you are not protected.

That was a little lesson about security, and I hope it points out why security is both difficult and something that you might want to be conscious of. Oftentimes, when you're sitting there using your laptop, you have this perception that it is my information I am sending, and nobody is going to be able to overhear it. It is going so fast and being transmitted so instantly that, of course, I am private. You feel very isolated when you're having this communication, until somebody points out that eavesdropping is actually very easy. This is a tool that is standard on all Linux distributions, and it is one line that we type into our computer to get this information. So, when you are transmitting sensitive information out over the network, in an email or in an instant message, you might want to actually think about whether or not you want to be sending this information out in the way that you're sending it.
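(Just to underline how little machinery eavesdropping takes: here is a sketch in Python of roughly what our Perl post-processing script did, pulling HTTP requests out of a capture. It assumes the third-party scapy package, and the capture file name is made up.)

    from scapy.all import rdpcap, TCP, Raw   # pip install scapy

    # Read a saved tcpdump/pcap capture and print the HTTP request lines,
    # which travel in the clear.
    for pkt in rdpcap("class_traffic.pcap"):   # hypothetical capture file
        if pkt.haslayer(TCP) and pkt.haslayer(Raw):
            payload = bytes(pkt[Raw].load)
            if payload.startswith((b"GET ", b"POST ")):
                print(payload.split(b"\r\n", 1)[0].decode(errors="replace"))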
What I want to do now is come back to something we were talking about last time: finish up our discussion of authorization, and then talk about how we can start thinking about designing protocols that actually provide security; how we can reason about these security protocols that we are using, and in particular how we can reason about authentication protocols within networks.

If you remember, last time we talked about this idea of building up a secure communication channel, and then we talked about doing authorization on top of that channel. Remember, authorization is this process of determining whether a given user should have access to a particular resource that they want access to. We talked about two main techniques for doing this: ACLs and tickets.

An ACL is an access control list, which is just a list of users who are authorized to access a particular resource. The way an access control list typically works is that the system first authenticates the user to determine what their identity is, and then it checks whether that identity is actually on the access control list, whether the user is authorized to access this particular resource they want access to.

Now, a ticket has a slightly different flavor: typically, if a user presents a ticket, you do not need to check what their identity is. Simply having the ticket is sufficient to allow somebody to have access to a resource. The analogy here is very similar to a physical ticket in the real world: you go to a concert, you hand them a ticket, and they let you into the concert without checking your ID.

A common example of a ticket on the web is these cookies you get, which are like tickets. And the reason you use cookies, instead of checking an access control list every time, is that when you have a particular cookie the system doesn't have to re-authenticate you in order to allow you to have access to the system every time you revisit a page.
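(To make the two mechanisms concrete, here is a toy guard in Python. The resource names and the HMAC-based ticket format are invented for illustration; the point is only that the ACL path needs an authenticated identity while the ticket path does not.)

    import hmac, hashlib, secrets

    ACL = {"grades.txt": {"alice", "bob"}}       # resource -> authorized users

    def acl_check(user, resource):
        # ACL path: the system must first authenticate the user, then
        # check that identity against the list.
        return user in ACL.get(resource, set())

    SERVER_KEY = secrets.token_bytes(32)

    def issue_ticket(resource):
        # Ticket path: possession is sufficient; identity is never re-checked.
        tag = hmac.new(SERVER_KEY, resource.encode(), hashlib.sha256).hexdigest()
        return resource + ":" + tag

    def ticket_check(ticket):
        resource, tag = ticket.rsplit(":", 1)
        good = hmac.new(SERVER_KEY, resource.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(tag, good)

    print(acl_check("alice", "grades.txt"))          # True: identity was checked
    print(ticket_check(issue_ticket("grades.txt")))  # True: no identity involved

A web cookie works like issue_ticket here: the server hands the browser an unforgeable token once, and afterwards accepts the token itself instead of re-running the login step.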
For example, I have a cookie for Amazon.com on my computer when I am in the process of purchasing some item from Amazon.com. It identifies me and my shopping cart, and I don't have to re-log-in every time I request a new secure page from Amazon.

Specifically, what we are talking about is the web. We had a browser and a web server, and the browser and the web server were talking over one of these secure communication channels. And we said this channel has both been authenticated and is confidential.

Authorization, we said, happens where there is some guard process in charge. This is our guard, and it sits in front of the services on the system. When a request comes in to access a service, the web server passes that request on to the guard module, which does the authorization. And then, if the request is authorized, the guard goes ahead and passes it on to the service.

We talked about this notion of how we establish a secure communication channel. One way we talked about doing this is where B, for example, would discover W's public key somehow and then use that public key to exchange with W a shared secret key that they could use to encrypt all their communication. I will show that protocol again in a minute. But the question that we are really going to get at today is how B actually discovers W's public key in a public key system, and how B can convince itself that it is, in fact, actually talking with this service W. If B wants to have a conversation with Amazon.com, how does it actually convince itself that this public key that it has does, in fact, belong to Amazon.com, and that it is actually having a conversation with Amazon.com, instead of with somebody else who is pretending to be Amazon.com, or who has lied about Amazon.com's public key so that they can overhear Amazon.com's traffic? We are going to focus in more on that now.
Let's look at a simplified version of an authentication protocol. This is going to be like the Denning-Sacco protocol that I showed with the broken example, and SSL on the Internet, for example, also works roughly in this way. We've got our browser B, we've got our web server W, and we've got some certificate authority. This is the kind of picture we drew before. The browser sends the certificate authority a request for, say, the public key of W. The certificate authority sends a message back with the certificate for B and the certificate for W. These are simply signed statements from the certificate authority that include the public keys for B and W in them; that is all certificates are. Then B sends a message to W. There is some trickiness in getting this message right, but it is basically a signed and encrypted message that tells W that B would like to initiate a conversation with it, and perhaps it proposes a shared key that they can use in that conversation. It is a little bit tricky to actually get the format of that message right, and that is where the bug in the Denning-Sacco protocol was, but you get the idea. W then sends a message back, perhaps trying to authenticate the user: OK, I agree that that is the key; now what is your name and what is your password, for example. And then, once that name and password have been exchanged, B is allowed to log into the system and access whatever resources B should have access to on the server.
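(Here is a stripped-down sketch of just the certificate step, assuming the pyca/cryptography package. A real certificate also carries names, expiry dates, and so on, and real SSL goes on to negotiate a shared key; this only shows what "the CA signs W's public key, and B checks that signature" looks like.)

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    ca_key = Ed25519PrivateKey.generate()   # the certificate authority
    w_key = Ed25519PrivateKey.generate()    # the web server W

    # A "certificate" here is just W's public key, signed by the CA.
    w_pub = w_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
    cert_w = (w_pub, ca_key.sign(b"key of W:" + w_pub))

    # B must already hold the CA's public key to check the certificate.
    ca_pub = ca_key.public_key()
    pub, sig = cert_w
    ca_pub.verify(sig, b"key of W:" + pub)  # raises InvalidSignature if forged
    print("certificate checks; B can treat this key as W's")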
The question that I want to focus on is how B actually decides which certificate authority it should trust. How does B initially establish the authenticity of the certificate authority? When we talked about authentication before, we presented this model that was, well, you are going to decide that you trust somebody by, for example, having a conversation with them. I am going to run into you in the hall, you're going to say my public key is X, and then I am going to be able to have a conversation with you. But, clearly, that is not what happens in the case of the Internet. On the Internet, I do not have to call up Amazon.com, or call up my certificate authority and get the certificate authority's public key from them; somehow I have already gotten this thing. And so we want to think about how it is that we can actually exchange these keys, and how it is that these systems can actually decide that they trust each other and that they are willing to authenticate each other.

To do that, what we are going to do is look at something called authentication logic. The specific version of authentication logic we are going to talk about is something called BAN logic. B, A, and N are just the initials of the three people who proposed it, Burrows, Abadi, and Needham, so it is not an acronym you need to know.

What authentication logic is going to do is allow us to reason about these protocols, so that we can decide whether or not we actually trust something like a given certificate authority. Just to think about this a little bit more, let's look at a specific example. Suppose that you receive a message m over the Internet, signed with some key, call it kA; that is, you receive sign(m, kA). And suppose that the contents of message m are: give A your credit card number. This might be a valid request; we might be purchasing something from Amazon.com, in which case we might expect to receive a request like this. So this thing was signed with kA, and the message said give A your credit card number. Maybe A, in this case, is Amazon.com, and we think we are talking to Amazon.com.
And so, if this were in fact an authentic message, we might actually be willing to give Amazon our credit card number. How do we actually know, though, that kA is the key that belongs to Amazon.com? We want to answer the question: how do we know, or trust, that kA, as we are going to say, speaks for A? This "speaks for" notation just means that if we see something that, for example, is signed with kA and claims to be from A, we are willing to trust that it actually came from A. We are going to try to answer the question of how we come to trust this.

One way we might come to trust that this is true is if we had heard, for example, some other message, call it m2, from somebody B, and this message said: kA is A's key. So one way we might come to trust kA is if we had heard from somebody else, or from some other place, that kA was A's key. But now this is kind of a slippery slope, because now you ask the question: well, how did you determine that you trust B's key? There is this infinite regress that you could end up going through, and that doesn't sound like a very good idea.

One way we might bottom out this recursion, and end this infinite process of collecting keys, is if at some point we actually heard from one person directly. Suppose there is a B and a C and a D and an E and an F, and eventually we get to F. F is our friend, who we have talked to, and F said: my public key is this, and here are the public keys of a few people that I know. So you might trust F, and you might believe that F knows a few other people. And then those people who F knows might in turn know a few other people. And eventually we might have this web of interconnected people, all of whom we trust. This would be a "web of trust".
We might be able to build up a web of trust like this, but you can imagine that these relationships get pretty complicated. For example, B might trust a few people, and C might trust a few people, and D might trust a few people. And, every time we're presented with some message, we need some principled way of deciding whether the set of keys we hold actually allows us to accept this message, or believe this message, or not. If we had a web of trust like this, we would need some way of going about validating or verifying that these messages were, in fact, messages we should trust. That is what authentication logic is really all about.

This is the simplified BAN logic that is presented in the text, on page 11-85. I will try to leave time for you to copy it down, but since it is in the text you might not need to worry about copying down the exact logic here. There are three rules in this simplified authentication logic, and they use this "speaks for" notation, and they are also going to use a "says" notation.

The first rule says: if A says "B speaks for A", then B speaks for A. This sounds like an obvious statement. "A says" something means that we heard A say it and we believed that it was, in fact, A who told us this thing. So, for A to say something, this cannot be, for example, a message that we received over an untrusted network. This has to be: I sat next to A in the hall, and A actually said "B speaks for me". In this case, B might be, for example, some key. So A might say: my public key is kA. If that is true and, in fact, we believe that A is the person who said this to us, then we can infer from that that this key kA actually speaks for A.
This is what the first rule is, and it is the base case: you have a trusted communication channel over which you are communicating with A, and you can use that trusted channel to build up some belief about, for example, a key representing somebody.

Now, the next rule says that, given that we have some belief like this, for example if we know that B speaks for A, and we hear B saying "A says X", then we may believe that, in fact, A says X. A simple example of this: suppose we get a message that has been signed by this key kA, sign(m, kA). If we see that, then we may, in fact, believe that kA says "A says m". And we believe that kA says this because we believe that this sign primitive means that anything signed with kA was, in fact, generated by the holder of kA; nobody else could have generated a signed message that verifies with kA. So we can receive a message from A that has been signed using kA, and we can, in fact, believe that A said something.

Now, the final rule is a transitivity rule, which is a way of expanding this web of who is related to whom. It says: if B speaks for A, and A speaks for C, then what we're going to believe is that B speaks for C. This is just a transitive relationship. For example, if we know that kA speaks for A, and we hear B say "A speaks for B", that is, B delegates and says A can speak for me, then we may believe that the key kA also speaks for B.

The idea with these rules is that we are going to be able to apply them any time we have a conversation, to decide, again, whether we actually trust the people who we are communicating with.
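(The text treats these as formal inference rules. Purely as a toy, here is one way to mechanize the three rules in Python; the tuple encoding is invented here, and the "says" facts are taken as trusted inputs, which is exactly the assumption the rules lean on.)

    # Facts: ("speaks_for", B, A) or ("says", A, X); X may itself be a fact.
    def close_under_rules(facts):
        facts = set(facts)
        while True:
            new = set()
            for f in facts:
                # Rule 1: A says (B speaks for A)  =>  B speaks for A
                if f[0] == "says" and isinstance(f[2], tuple) \
                        and f[2][0] == "speaks_for" and f[2][2] == f[1]:
                    new.add(f[2])
                # Rule 2: B speaks for A, B says (A says X)  =>  A says X
                if f[0] == "says" and isinstance(f[2], tuple) and f[2][0] == "says":
                    if ("speaks_for", f[1], f[2][1]) in facts:
                        new.add(f[2])
                # Rule 3: B speaks for A, A speaks for C  =>  B speaks for C
                if f[0] == "speaks_for":
                    for g in facts:
                        if g[0] == "speaks_for" and g[1] == f[2]:
                            new.add(("speaks_for", f[1], g[2]))
            if new <= facts:
                return facts
            facts |= new

    # A told us in the hall that kA speaks for A; later we see sign(m, kA).
    facts = close_under_rules({
        ("says", "A", ("speaks_for", "kA", "A")),
        ("says", "kA", ("says", "A", "m")),
    })
    print(("says", "A", "m") in facts)   # True: we may believe A said m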
And you may have noticed, when I was talking about this, that there were a lot of statements like: nobody can actually fake this sign(X); we believe that nobody could forge this message that was signed with kA, for example. If we believe that that is true, then we can say kA says this thing. This authentication logic is full of a lot of these "if we believe this thing is true" kinds of assumptions that we are going to be making.

[Answering a question from the audience] My example was kind of confusing. In this example, the B of the rule is actually C, and kA is B. It says that if we know that A speaks for B, then we might be willing to believe that kA actually speaks for B, given that kA speaks for A. So we have this transitivity kind of a relationship. Sorry that the notation was a little bit confusing.

I agree. The challenge here is that, any time we're using this authentication logic in order to actually determine that somebody definitely said something, we are going to have to make some set of assumptions about what we trust and what we believe. And one of the things that is important, any time you are using this authentication logic, is to actually make these assumptions as explicit as you can.

Let's look at a set of assumptions that we are making here. I said something like: when I see a message sign(m, kA), I infer from that that A says m, assuming that I already know that kA speaks for A. When I make this inference, I am making a set of assumptions, in particular about what this sign primitive does. I am assuming, for example, that sign is actually secure in some way. I am assuming that this signature is not forgeable, that somebody else could not have generated the signature unless they actually had access to the private key of A. I am also assuming, for example, that any private keys are actually private.
I am assuming, for example, that A hasn't gone out and broadcast the private key to everybody else in the world. Because if A had done that, then I wouldn't actually know that A said m; it could have been anybody who said m. I would be in trouble if that were the case.

So, any time I make this inference from seeing a signature like this, I am making these two assumptions. And it is not that these assumptions are wrong or bad; it is just that it is good to think explicitly about what they are. Assuming that the signature is not forgeable: well, we all know that cryptography can sometimes be broken. It is not that cryptography is totally foolproof, but we may have a relatively high confidence that this cryptography is going to be difficult for somebody to break. And assuming that private keys are actually private: we might know A personally and believe that A is an upstanding individual, and we are going to trust that she hasn't gone and disseminated her key everywhere around the world, because that wouldn't really be in her interest and we take her to be a trustworthy person. This is a bit of a leap of faith that we are making. But if we believe those two things, then we can, in fact, infer that A says m when we see a message that is signed with kA.

Let's look at another example, one that works with the public key cryptography we talked about in this authentication protocol here. Suppose that A tells me: my public key is kA-pub. And this is done over a secure communication channel, like in-person communication. I might then infer that kA-pub actually speaks for A. Now, suppose I see a message that has been signed with kA-priv. I might say, and this is an inference, that kA-priv says "A says m". The question we want to ask now is: should we actually believe that A said m? I saw a message, I know what A's public key is, I trust A's public key, maybe, and I saw this signed thing.
Now I need to think about whether I believe that A actually said m, given that I saw a message that was signed with kA-priv. There is something we obviously need to do, which is verify the message. Suppose we go and run this verify step on the message, and the verify comes out true: I verify, using kA-pub, that this message was, in fact, appropriately signed. Verify says yes, the signature on this message checks.

Now should I believe that A actually said m? That depends. There are a couple of conditions, again a set of assumptions, that we are making. One thing we are clearly doing is assuming that this cryptosystem is secure. We are also believing that this verification step, taking something that has been signed with a private key and verifying it with the public key, is as good as A actually saying something. So one thing this clearly depends on is that the cryptosystem is secure.

There are a couple of other assumptions that we are making. One is that, again, it definitely requires that kA-priv has actually been kept secret. And, furthermore, it requires that A didn't lie to us about what kA-pub was. Because if A had given us a wrong kA-pub, then somebody else could have signed this message, and we might try to verify the message using the lied-about kA-pub, and then we might be in trouble. Again, this is similar to the example with just kA, but it shows how the reasoning works with public and private key encryption. It is the case that, again, we always need to think about making our assumptions as explicit as possible. And that is really what this is about: deciding that we trust that the signature checks is analogous to deciding that we trust the cryptography.
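(Here is what that sign-then-verify inference looks like in code, assuming the pyca/cryptography package. The comments make the same point as above: a successful verification only tells you that the holder of kA-priv said m; getting from there to "A says m" rests on the stated assumptions.)

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    ka_priv = Ed25519PrivateKey.generate()   # assumed: A keeps this secret
    ka_pub = ka_priv.public_key()            # assumed: learned over a secure channel

    m = b"give A your credit card number"
    sig = ka_priv.sign(m)

    try:
        ka_pub.verify(sig, m)                # raises InvalidSignature on forgery
        print("signature checks: kA-priv says (A says m)")
    except InvalidSignature:
        print("forged or corrupted message")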
It is possible to apply this authentication logic in other environments as well, ones that aren't necessarily based on cryptography. For example, instead of trusting that the cryptographic system works, I might trust that a particular communication channel is secure. If I pick up the telephone and call you and give you some message, you might trust that it's me because, for example, you believe that it wouldn't be possible for somebody to fake my voice, and you recognize my voice because you've come to lecture so many times. That would be an example of a way in which you might decide that you trust that something is actually true, and the assumption you're making is that somebody is not able to fake my voice. The point of authentication logic is that it gives us a way to reason about these kinds of deductions, about whether or not we actually believe that somebody said something was true or not.

That is fine. We still, though, at least when we're using public and private key cryptography, haven't exactly answered the question of how we go about establishing this initial trust step. At the bottom of this, we think, we have to actually physically hear somebody say something over a secure communication channel. We have to have some way of getting this first rule, where A says, for example, "my public key is such-and-such", so that we can learn A's public key and then bootstrap communication with A. We haven't yet answered the question of how we actually exchange these keys, except by this one method that we talked about, the web-of-trust method.

So there is this question about establishing some initial trust. One way we might do this is using the web-of-trust approach, where I meet a friend, that friend tells me about some people who he or she trusts, and then those friends' friends in turn know a few people, and eventually we establish communication with everybody. There are some computer systems that work this way.
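(As a toy illustration of the reachability idea: starting from keys you verified in person, you extend trust along your friends' endorsements. The names and the blanket "trust is transitive" rule are invented for illustration; real systems let you limit how far trust is delegated.)

    from collections import deque

    endorsements = {          # who vouches for whose key
        "me": ["F"],          # I verified F's key in person
        "F": ["B", "C"],
        "B": ["D"],
        "D": ["E"],
    }

    def trusted_keys(start="me"):
        # Breadth-first search over the endorsement graph.
        seen, queue = {start}, deque([start])
        while queue:
            person = queue.popleft()
            for friend in endorsements.get(person, []):
                if friend not in seen:
                    seen.add(friend)
                    queue.append(friend)
        return seen - {start}

    print(sorted(trusted_keys()))   # ['B', 'C', 'D', 'E', 'F']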
In particular, there is a system called PGP, for Pretty Good Privacy, which is an email encryption system that works this way. There are, in fact, websites where the idea is that you find somebody else who has used this system, you exchange public keys with them, and then, once you have a few public keys, you can build out this web of trust to everybody else by trusting the people who your friends trust.

The problem is, one, the Internet clearly doesn't work this way, so this is not the kind of system that you're using on the Internet. And, two, these kinds of systems are very difficult to set up and maintain, and there ends up being some centralized administration; it becomes very difficult to actually disseminate these public keys in these kinds of systems. The way this works in practice is that there is a centralized website for PGP that is a giant database of keys, and you try to discover people's keys by walking this web of trust, but it tends to be difficult to use.

Let's talk a little bit about how this actually works in the context of the Internet. Suppose there is actually some person, P, who is sitting in front of this browser. There are some questions that we might want to ask when we're thinking about deciding whether or not we trust this person P. One question that the web server might ask is: does W know or trust person P, and how does it? Say this is a commercial website like Amazon.com. Actually, the trust story turns out to not be that complicated. Amazon.com doesn't really care whether you are who you claim to be, as long as you give them the money that they want. You can go on and tell them that you are Oscar the Grouch; they don't care, as long as you give them the money that they want for whatever it is that you order. That is all that matters to them.
So, in fact, Amazon doesn't actually know exactly who you are, but they might decide that they trust you because the credit card number is good and is verified by the credit card company. When they charge your credit card, they get the money and they are happy.

And then there is this other question, though, about how P trusts W. The way this works on the Internet is basically the way the authentication protocol that I have shown you here works; this is how SSL works. And it turns out to be very subtle. It can be quite hard to actually convince yourself that you should trust a given website if you start drilling down on it. The way the browser decides that it trusts W is by checking with some certificate authority; that is how it works on the Internet. And there are some issues with it: how does the certificate authority authenticate W? How does the certificate authority decide what W's key is, and how do they securely exchange keys? How did the user get the CA's public key? And what if somebody's private keys are stolen, for example W's or the certificate authority's?

What I want to do is just show you, at a high level, how this actually works on the Internet. I thought it might be instructive to see an example of what is actually going on and to illustrate how some of these issues are, and are not, properly addressed by the current SSL-based Internet architecture.

I have a couple of websites open here: an Amazon.com website and, in this case, an Apple developer website, both of which you will notice have HTTPS URLs associated with them. In this browser, I can click on this little secure icon here; almost any browser has a similar sort of feature, a little lock somewhere that appears when you are viewing an HTTPS page.
753 00:41:14,300 --> 00:41:20,900 What I see is something that lists 754 00:41:20,900 --> 00:41:23,080 how it was that the browser actually 755 00:41:23,080 --> 00:41:26,830 decided that this was a secure site that it should communicate 756 00:41:26,830 --> 00:41:27,330 with. 757 00:41:27,330 --> 00:41:28,870 When that little lock appears it means 758 00:41:28,870 --> 00:41:30,350 that my browser has actually decided 759 00:41:30,350 --> 00:41:31,410 that it trusts this site. 760 00:41:31,410 --> 00:41:33,440 We are at the point where we have already 761 00:41:33,440 --> 00:41:35,300 said we trust this site. 762 00:41:35,300 --> 00:41:37,440 And the reason that we trust this site, it says, 763 00:41:37,440 --> 00:41:43,150 is because this site is WWW dot Amazon dot com 764 00:41:43,150 --> 00:41:48,670 and we have this certificate for this site that 765 00:41:48,670 --> 00:41:53,190 was issued by RSA Data Security Incorporated. 766 00:41:53,190 --> 00:41:56,860 You might say, who is RSA Data Security Incorporated? 767 00:41:56,860 --> 00:41:58,360 I have never heard of these people. 768 00:41:58,360 --> 00:42:01,280 I have never exchanged any information with them. 769 00:42:01,280 --> 00:42:03,000 But they are the certificate authority. 770 00:42:03,000 --> 00:42:05,350 That is the entity that signed this thing. 771 00:42:05,350 --> 00:42:07,970 I can look and sort of drill down on this. 772 00:42:07,970 --> 00:42:10,340 And what you will see is some information 773 00:42:10,340 --> 00:42:11,760 about Amazon dot com. 774 00:42:11,760 --> 00:42:14,590 This is the actual certificate itself. 775 00:42:14,590 --> 00:42:18,400 This is some information about the certificate authority, 776 00:42:18,400 --> 00:42:20,220 including information about what algorithm 777 00:42:20,220 --> 00:42:22,070 the certificate authority has used 778 00:42:22,070 --> 00:42:26,100 to generate this certificate. 779 00:42:26,100 --> 00:42:29,180 And then now what we have is this public key information, 780 00:42:29,180 --> 00:42:32,590 the actual public key that corresponds to Amazon dot com. 781 00:42:32,590 --> 00:42:34,360 The certificate contains Amazon dot 782 00:42:34,360 --> 00:42:36,240 com's public key, a key 783 00:42:36,240 --> 00:42:38,050 for the RSA algorithm. 784 00:42:38,050 --> 00:42:41,900 And then there also is some information, this signature 785 00:42:41,900 --> 00:42:43,350 that goes with this public key. 786 00:42:43,350 --> 00:42:48,580 This is the signature that was produced by this RSA corporation. 787 00:42:51,600 --> 00:42:53,520 If I go to a different site like, for example, 788 00:42:53,520 --> 00:42:56,040 this Apple site over here, we see 789 00:42:56,040 --> 00:42:59,400 that this thing was actually signed by somebody else. 790 00:42:59,400 --> 00:43:02,150 This certificate is signed by somebody 791 00:43:02,150 --> 00:43:03,690 called VeriSign Incorporated. 792 00:43:03,690 --> 00:43:06,470 Again, you have no idea who VeriSign Incorporated is, 793 00:43:06,470 --> 00:43:09,111 but you will notice that these are two different certificate 794 00:43:09,111 --> 00:43:09,610 authorities. 795 00:43:09,610 --> 00:43:12,130 It is not the case that there is one master certificate 796 00:43:12,130 --> 00:43:13,796 authority out there on the Internet that 797 00:43:13,796 --> 00:43:16,700 is in charge of everything.
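Mechanically, "signed by RSA Data Security" or "signed by VeriSign" means the signature over the body of the certificate checks out under the issuer's public key. Here is a rough sketch of that check using the third-party Python cryptography package; the PEM file names are hypothetical, and this assumes an RSA-signed certificate, as in the lecture example:

    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import padding

    # Hypothetical files: the site's certificate and its issuer's certificate.
    with open("amazon.pem", "rb") as f:
        site = x509.load_pem_x509_certificate(f.read())
    with open("issuer.pem", "rb") as f:
        issuer = x509.load_pem_x509_certificate(f.read())

    # Verify the issuer's signature over the signed portion of the site
    # certificate; this raises InvalidSignature if it does not check out.
    issuer.public_key().verify(
        site.signature,
        site.tbs_certificate_bytes,
        padding.PKCS1v15(),
        site.signature_hash_algorithm,
    )
    print("signature verifies:", site.subject.rfc4514_string(),
          "was signed by", issuer.subject.rfc4514_string())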
798 00:43:16,700 --> 00:43:21,040 On a Mac anyway, and the same is true of most other operating 799 00:43:21,040 --> 00:43:23,970 systems, there is a way that we can 800 00:43:23,970 --> 00:43:27,130 go and look at the actual list of all the people 801 00:43:27,130 --> 00:43:28,679 whom we trust. 802 00:43:28,679 --> 00:43:30,470 This is sort of an instructive thing to do: 803 00:43:30,470 --> 00:43:32,980 to go and see who it is that this computer 804 00:43:32,980 --> 00:43:33,870 actually trusts. 805 00:43:33,870 --> 00:43:39,960 I can click on this certificate list over here. 806 00:43:39,960 --> 00:43:44,760 This is a list of all the certificates that I have on this machine 807 00:43:44,760 --> 00:43:49,280 and whose signatures I would accept. 808 00:43:49,280 --> 00:43:52,760 You can see there is a very large number of certificates 809 00:43:52,760 --> 00:43:54,120 that this machine has installed. 810 00:43:54,120 --> 00:43:57,070 These are not just certificates but actual certificate 811 00:43:57,070 --> 00:43:57,570 authorities, 812 00:43:57,570 --> 00:43:59,630 people whom I would actually trust 813 00:43:59,630 --> 00:44:11,150 to provide me 814 00:44:11,150 --> 00:44:19,890 with a certificate for, say, some Web server 815 00:44:19,890 --> 00:44:21,950 that I want to communicate with. 816 00:44:21,950 --> 00:44:24,835 Now the question is, where did these things come from? 817 00:44:24,835 --> 00:44:26,210 They came from one of two places. 818 00:44:26,210 --> 00:44:28,514 Either I added them to the system myself. 819 00:44:28,514 --> 00:44:29,930 In this case, I have, for example, 820 00:44:29,930 --> 00:44:32,310 an MIT certification authority. 821 00:44:32,310 --> 00:44:34,340 If any of you have ever used MIT certificates, 822 00:44:34,340 --> 00:44:38,530 like when you're logging onto WebSIS, 823 00:44:38,530 --> 00:44:40,750 in order to be able to access that system 824 00:44:40,750 --> 00:44:42,410 you had to add support for a given 825 00:44:42,410 --> 00:44:45,080 certificate to your browser. 826 00:44:45,080 --> 00:44:48,370 This MIT certification authority is a certificate 827 00:44:48,370 --> 00:44:53,350 that came from MIT, and this Keychain tool, 828 00:44:53,350 --> 00:44:54,840 which shows me my certificates, 829 00:44:54,840 --> 00:44:56,830 has this funny sort of message that 830 00:44:56,830 --> 00:44:59,720 says this certificate is not in the trusted root database. 831 00:44:59,720 --> 00:45:07,110 The reason it says that is that this is a certificate for the MIT 832 00:45:07,110 --> 00:45:09,500 certification authority that I just got 833 00:45:09,500 --> 00:45:12,330 off the Web somewhere. 834 00:45:12,330 --> 00:45:13,890 And I didn't do anything 835 00:45:13,890 --> 00:45:16,790 to actually verify that this, in fact, was 836 00:45:16,790 --> 00:45:19,176 a legitimate certificate from the certification authority. 837 00:45:19,176 --> 00:45:21,050 Somebody could have put this thing on the Web 838 00:45:21,050 --> 00:45:23,660 and said this is a certificate for the MIT certificate 839 00:45:23,660 --> 00:45:27,820 authority and you should use it to verify certificates 840 00:45:27,820 --> 00:45:30,230 for anybody who claims to be coming from MIT dot edu. 841 00:45:30,230 --> 00:45:33,740 They could have lied about it, and then I would be in trouble.
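As an aside, the list of authorities this keychain window is displaying can also be pulled up programmatically. A short sketch with Python's standard library; depending on the platform, this may surface only the roots visible to the OpenSSL layer rather than the full system keychain:

    # Sketch: list the certificate authorities this machine trusts,
    # roughly what the keychain window in lecture is showing.
    import ssl

    context = ssl.create_default_context()  # loads the platform's trusted roots

    for ca in context.get_ca_certs():
        # Each entry is a parsed certificate; the subject names the authority.
        print(ca["subject"])

    print(len(context.get_ca_certs()), "trusted certificate authorities")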
842 00:45:33,740 --> 00:45:37,260 It is possible that this is a fake certificate for the MIT 843 00:45:37,260 --> 00:45:41,270 certificate authority, but I have sort of 844 00:45:41,270 --> 00:45:43,500 made this trust assumption that it is not, 845 00:45:43,500 --> 00:45:45,125 because of the website I downloaded it from. 846 00:45:45,125 --> 00:45:47,980 And I believe that, in fact, this is a valid certificate. 847 00:45:47,980 --> 00:45:51,050 The other things you see here are this large number 848 00:45:51,050 --> 00:45:52,550 of certificates. 849 00:45:52,550 --> 00:45:54,350 For example, one of the ones that we see, 850 00:45:54,350 --> 00:46:00,810 let's see if we can find RSA here, is this RSA Security. 851 00:46:00,810 --> 00:46:03,010 Here I see RSA Security. 852 00:46:03,010 --> 00:46:05,977 And this thing actually says this certificate is valid. 853 00:46:05,977 --> 00:46:07,560 Now there is this question about where 854 00:46:07,560 --> 00:46:09,660 this RSA Security certificate came from. 855 00:46:09,660 --> 00:46:12,280 And the way that this works is that this browser 856 00:46:12,280 --> 00:46:14,590 and, in this case, the operating system 857 00:46:14,590 --> 00:46:17,330 itself have actually been shipped with a bunch of certificates 858 00:46:17,330 --> 00:46:20,480 from a bunch of different certificate authorities. 859 00:46:20,480 --> 00:46:23,330 And so these have been installed by the operating system 860 00:46:23,330 --> 00:46:24,070 manufacturer. 861 00:46:24,070 --> 00:46:26,852 And I am saying I trust Apple to have 862 00:46:26,852 --> 00:46:28,310 gotten the appropriate certificates 863 00:46:28,310 --> 00:46:30,309 and loaded them appropriately into the computer. 864 00:46:30,309 --> 00:46:32,930 And any time you download a browser, basically, 865 00:46:32,930 --> 00:46:36,310 that browser already has a set of certificates 866 00:46:36,310 --> 00:46:37,810 for a set of certificate authorities 867 00:46:37,810 --> 00:46:39,210 already installed in it. 868 00:46:39,210 --> 00:46:41,100 And that is where all of these base certificate authorities 869 00:46:41,100 --> 00:46:41,520 come from. 870 00:46:41,520 --> 00:46:43,750 You can see that when you are using one of these things 871 00:46:43,750 --> 00:46:45,583 you have already placed quite a bit of trust 872 00:46:45,583 --> 00:46:48,080 and made a number of assumptions about who you trust 873 00:46:48,080 --> 00:46:50,570 and what you should trust just by using this system. 874 00:46:50,570 --> 00:46:54,615 We have now answered this question 875 00:46:54,615 --> 00:46:57,240 about where the user got the public key for the certificate 876 00:46:57,240 --> 00:46:58,060 authority. 877 00:46:58,060 --> 00:47:00,710 Either they explicitly added it by downloading it 878 00:47:00,710 --> 00:47:03,612 from a website, for example, or it came with the browser. 879 00:47:03,612 --> 00:47:06,070 What about this question: how does the certificate authority 880 00:47:06,070 --> 00:47:07,670 authenticate an actual website? 881 00:47:07,670 --> 00:47:12,074 How does Amazon dot com get a certificate from VeriSign? 882 00:47:12,074 --> 00:47:13,490 The answer to this turns out to be 883 00:47:13,490 --> 00:47:17,380 that Amazon dot com probably paid VeriSign some money. 884 00:47:17,380 --> 00:47:21,590 And it is not even clear you can say that much in general, 885 00:47:21,590 --> 00:47:24,020 but in the case of VeriSign you can be 886 00:47:24,020 --> 00:47:26,380 pretty sure that Amazon dot com paid them some money.
887 00:47:26,380 --> 00:47:28,814 And you can probably go to the VeriSign website 888 00:47:28,814 --> 00:47:30,230 and see a list of things that they 889 00:47:30,230 --> 00:47:34,260 claim to have done in authenticating Amazon dot com. 890 00:47:34,260 --> 00:47:36,245 They may have forced Amazon dot com 891 00:47:36,245 --> 00:47:38,370 to present them with some information that actually 892 00:47:38,370 --> 00:47:41,310 suggests that they are the company Amazon dot com, 893 00:47:41,310 --> 00:47:43,810 or they own some right to the name Amazon dot com, 894 00:47:43,810 --> 00:47:47,520 or that they are incorporated under the name Amazon dot com. 895 00:47:47,520 --> 00:47:49,380 Basically, the certificate authority 896 00:47:49,380 --> 00:47:53,080 has some protocol that it uses to authenticate W. 897 00:47:53,080 --> 00:47:55,910 And we don't know what that protocol is, 898 00:47:55,910 --> 00:47:57,722 but you're sort of implicitly trusting 899 00:47:57,722 --> 00:47:59,930 that all these certificate authorities that Apple has 900 00:47:59,930 --> 00:48:01,700 already approved for you do, in fact, 901 00:48:01,700 --> 00:48:03,900 use some process that gives you reasonable assurance 902 00:48:03,900 --> 00:48:06,880 of your privacy, or guarantees 903 00:48:06,880 --> 00:48:08,960 that the websites that you are connecting to 904 00:48:08,960 --> 00:48:11,240 are, in fact, valid websites that you 905 00:48:11,240 --> 00:48:14,040 should feel comfortable giving your credit card number to. 906 00:48:14,040 --> 00:48:16,120 You can see that this is a little bit dicey. 907 00:48:16,120 --> 00:48:18,980 It feels like it is not clear 908 00:48:18,980 --> 00:48:21,002 that this is incredibly secure. 909 00:48:21,002 --> 00:48:22,460 And then the last question is, well, 910 00:48:22,460 --> 00:48:24,320 what if somebody's private keys are stolen? 911 00:48:24,320 --> 00:48:27,940 In this particular example, think about 912 00:48:27,940 --> 00:48:30,760 the certificates that you have from Amazon dot com: 913 00:48:30,760 --> 00:48:34,035 if Amazon dot com's private keys are stolen, 914 00:48:34,035 --> 00:48:36,160 or one of the certificate authorities' private keys 915 00:48:36,160 --> 00:48:38,087 is stolen, then you are in trouble. 916 00:48:38,087 --> 00:48:40,170 These certificates typically have expiration dates 917 00:48:40,170 --> 00:48:42,170 associated with them, but these expiration dates 918 00:48:42,170 --> 00:48:46,170 are often a year or more into the future. 919 00:48:46,170 --> 00:48:48,740 And so you are going to continue to trust that certificate 920 00:48:48,740 --> 00:48:52,050 until it expires at some point in the future, when you 921 00:48:52,050 --> 00:48:53,310 will go try and get a new one. 922 00:48:53,310 --> 00:48:56,150 For that whole time that you trust that certificate, somebody 923 00:48:56,150 --> 00:48:58,720 who had access to the private keys of that certificate 924 00:48:58,720 --> 00:49:00,220 authority or of that website would 925 00:49:00,220 --> 00:49:04,610 be able to pretend to be that certificate 926 00:49:04,610 --> 00:49:06,032 authority or that website. 927 00:49:06,032 --> 00:49:07,240 So that might be problematic. 928 00:49:07,240 --> 00:49:09,830 That is sort of the end of this little discussion about how 929 00:49:09,830 --> 00:49:11,120 certificate authorities work.
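One concrete detail worth seeing: that expiration window is a field you can read right off the certificate. A small sketch, again with the Python standard library and www.amazon.com as an example host, that computes how long the certificate will continue to be trusted:

    # Sketch: the expiration check applied to every certificate. A stolen
    # key stays dangerous until this date passes (absent revocation).
    import socket
    import ssl
    import time

    host = "www.amazon.com"  # example host
    context = ssl.create_default_context()

    with socket.create_connection((host, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()

    # notAfter is a string like 'Jun  1 12:00:00 2026 GMT';
    # cert_time_to_seconds converts it to a Unix timestamp.
    expires = ssl.cert_time_to_seconds(cert["notAfter"])
    days_left = (expires - time.time()) / 86400
    print(f"{host} certificate expires in about {days_left:.0f} days")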
930 00:49:11,120 --> 00:49:13,350 I hope you can see that this authentication logic is sort 931 00:49:13,350 --> 00:49:15,690 of a way that we can start thinking about what 932 00:49:15,690 --> 00:49:17,356 trust assumptions we are making 933 00:49:17,356 --> 00:49:20,130 when we are using these systems. 934 00:49:20,130 --> 00:49:23,510 This wraps up our discussion of security. 935 00:49:23,510 --> 00:49:28,327 We have one class session left. 936 00:49:28,327 --> 00:49:29,910 And the last class session is actually 937 00:49:29,910 --> 00:49:30,910 going to be a guest lecture. 938 00:49:30,910 --> 00:49:32,326 It is going to be somebody talking 939 00:49:32,326 --> 00:49:36,064 about how law and computers and technology interact 940 00:49:36,064 --> 00:49:36,730 with each other. 941 00:49:36,730 --> 00:49:38,190 We are going to kind of step back and talk 942 00:49:38,190 --> 00:49:41,960 much more about the high-level picture, things like policy and law, 943 00:49:41,960 --> 00:49:46,140 next time, and hopefully it will give you a good wrap-up 944 00:49:46,140 --> 00:49:47,620 for the whole of 6.033. 945 00:49:47,620 --> 00:49:49,060 We will see you next time. 946 00:49:49,060 --> 00:49:50,534 Please come to class, because there 947 00:49:50,534 --> 00:49:52,700 will be questions about the guest lecture on the exam, 948 00:49:52,700 --> 00:49:54,970 so you want to make sure you don't miss it.