The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation or view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

ROBERT GALLAGER: OK, so today we're going to review Little's theorem a little bit, but say a few new things about it. I want to say something about Markov chains and renewal processes, because one of the most valuable things about understanding both is that you can use renewal theory to solve an extraordinary number of Markov chain problems, and you can use Markov chains to solve an awful lot of renewal problems. And I want to make that clear today, because it's a trick that you have perhaps seen in the homework, or perhaps you've missed it. If you missed it, you've probably done a lot of extra work that you wouldn't have had to do otherwise. But it's a very useful thing, so it's worth understanding it. Finally, we'll talk a little bit about delayed renewal processes at the end.
What we will say, essentially, is that there's a long section on delayed renewal processes, which goes through and does everything we did for ordinary renewal processes, and as far as almost all the asymptotic properties are concerned, they're exactly the same. It's just modifying a few of the ideas a little bit. The essential thing there is that when you're looking at something asymptotically, in the limit as t goes to infinity, what happens in that first little burst of time doesn't really make much difference anymore. So we will talk about that if we get that far.

OK, one of the main reasons why convergence with probability 1 is so important-- you've probably wondered why we're spending so much time talking about this, why we use it so often. I would like you to get some idea of why it is often much easier to use than convergence in probability. So I'm going to give you two examples of that. One of them is this initial thing, which we talked about in the notes also, and I talked about in lecture before.
There's this nice theorem which says that if the sequence of random variables converges to some number-- alpha-- with probability 1, and if f of x is a real valued function of a real variable that's continuous at x equals alpha. In other words, as you start converging, as you get close to alpha, this function is continuous there. And since it's continuous there, you have to get closer and closer. That's the essence of it. It says that then this function of z n-- a real valued function of a random variable is also a random variable-- converges with probability 1 to f of alpha.

That was the thing we used to get the strong law for renewals, which says that with probability 1, the limit as t goes to infinity of n of t of omega over t is equal to 1 over x-bar. In other words, the probability of the set of sample points for which this limit exists is equal to 1. Anytime you get confused by one of these statements that says "with probability 1"-- and by now you're probably writing this just as an add-on at the end-- you often forget that there's an awful lot tucked into that statement.
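A quick numerical sketch of that strong law for renewals (Python; the Exponential interarrival distribution, its mean, and the time horizon are assumptions chosen just for illustration):

```python
import random

def renewal_count(t_max, draw, rng):
    """Count renewals N(t) on one sample path up to time t_max."""
    t, n = 0.0, 0
    while True:
        t += draw(rng)               # next interarrival time X_n
        if t > t_max:
            return n
        n += 1

rng = random.Random(1)
x_bar = 2.0                          # E[X] for an Exponential with mean 2
t_max = 100_000.0
n = renewal_count(t_max, lambda r: r.expovariate(1.0 / x_bar), rng)
print(n / t_max)                     # close to 1/x_bar = 0.5 on this one path
```

The point of the sample-path view is visible here: a single long path already pins n of t over t near 1 over x-bar, whereas the weak-law statement would only tell you about the distribution at each fixed t.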
And I tried to put a little of it there. Initially, when we talked about it, we put more in, saying the probability of the set of omega such that this limit exists is equal to 1. We state it in all sorts of different ways. But always go back and think a little bit about what it's really saying. Random variables are not like numbers. Random variables are far more complicated things. Because of that, they have many more ways they can approach limits. They have many more peculiar features about them.

But anyway, the fact that this theorem holds true is a result of a little bit of monkeying around with n divided by the sum of n random variables, and associating that with an n of t over t. But it's also associated with this function here. So you have the two things. The thing which is difficult conceptually is this one here. So that's one place where we used the strong law. If we tried to state a weak law of large numbers for renewals, without being able to go from this strong law to the weak law, it'd really be quite hard to prove it. You can sit down and try to prove it if you want to, and I think you'll see that it really isn't very easy.
The strong law for renewals also holds if the expected value of x is equal to infinity. In this case, understanding why this is true really requires you to think pretty deeply about random variables, and about what it means to have an infinite expectation. But the idea here is: since x is a random variable, it can't take on infinite values except with probability 0. So it's always finite. So when you add a bunch of them, you get something which is still finite. So that s sub n is a random variable. In other words, if you look at the probability that s sub n is less than or equal to t, and then you let t go off to infinity, the fact that s sub n is a random variable means that the probability that s sub n is less than or equal to t goes to 1. And it does that for sample values with probability 1, which is a better way to say it.

Here I'm actually stating the convergence in probability too, because it follows from the convergence with probability 1.
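Here is a sketch of what the infinite-mean case looks like numerically (Python; the Pareto-type distribution with tail index 1/2 is an assumed example -- it gives X greater than or equal to 1 with infinite mean, so 1 over x-bar should be read as 0):

```python
import random

def renewal_count(t_max, draw, rng):
    """Count renewals N(t) on one sample path up to time t_max."""
    t, n = 0.0, 0
    while True:
        t += draw(rng)
        if t > t_max:
            return n
        n += 1

rng = random.Random(2)
heavy = lambda r: r.random() ** -2   # X = U**-2 >= 1 has infinite mean
ratios = []
for t_max in (1e4, 1e6, 1e8):
    n = renewal_count(t_max, heavy, rng)
    ratios.append(n / t_max)
    print(t_max, n / t_max)          # typically shrinks toward 0 as t grows
```

Every s sub n along the path is finite, just as the argument above says, but the sums grow so much faster than n that n of t over t is squeezed toward 0.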
Since you have convergence with probability 1, n of t over t also converges in probability, which says that the probability, as t goes to infinity, that n of t over t minus 1 over x-bar is greater than epsilon in magnitude, is equal to 0. That's this funny theorem we proved: convergence with probability 1 of a set of random variables implies convergence in probability.

Here's another theorem about convergence, which is called the elementary renewal theorem. We talked about that; we proved half of it after we talked about Wald's equality. And we said the other half really wasn't very interesting. And I hope some of you at least looked at that. After you look at it, it's a bunch of mathematics and a bunch of equations.

And it says this-- so we have three limit theorems about n of t, about what happens when t gets large to this number of renewals that occur up until time t, n of t. One of them is a strong law, which is really a sample path average.
I'm trying to start using the words "sample path average" instead of "time average," because I think it gives you a better idea of what's actually going on. But the strong law is really a sample path argument. The weak law-- this thing here about convergence in probability-- still tells you quite a bit, because it tells you that as t gets large, the probability that n of t over t can be significantly different from 1 over x-bar is going to 0.

This, in a sense, tells you even less. I mean, why does this tell you less than this does? What significant thing does this tell you that this doesn't tell you? Suppose we had a situation where half the time-- with probability 1/2-- n of t over t is equal to 2 over x-bar, and the other half of the time it's equal to 0. That can't happen, but according to this it could happen. The expected value of n of t over t would still be 1 over x-bar. But this statement doesn't tell you, when you think about it, whether n of t over t is really squeezing down on 1 over x-bar; it just tells you that the expected value of it is squeezing down on 1 over x-bar.
So this is really a pretty weak theorem, and you wonder why people spend so much time analyzing it. I'll tell you why in just a minute. And it's not a pretty story.

We talked about residual life. I want to use this, which I think you all understand-- I mean, for residual life, and for duration and age, you draw this picture, and then it's perfectly clear what's going on. So I don't think there's any possibility of confusion with that. Here's the original picture-- a sample path picture of arrival epochs, of the number of arrivals up until time t climbing up. And then we look at residual life, the amount of time at any time until the next arrival comes. This is strictly a sample path idea: for a particular sample path, from 0 to infinity, you look at the whole thing. In other words, think of setting up an experiment. And in this experiment you view the entire sample path for this particular sample point that you're talking about. You don't stop at any time, you just keep on going.
Obviously you can't keep on going forever, but you keep on going long enough that you get totally bored, and say, well, I'm not interested in anything after 20 years. And nobody will be interested in my results if I wait more than 20 years, and I'll be dead if I wait much longer than that. So you say we will take this sample path for a very long time. This is the sample path that we get.

We then argue that the integral of y of t over t is a sum of terms. y of t is a random variable here; if I put in a particular sample point, it's a number. Each of these terms here is a random variable. The sum of them is a random variable. And if I put in a particular sample point, it's a sum of numbers.

Now, we did the following thing with that-- I think it was pretty straightforward. You look at what the sum is up to n of t. In other words, for a particular time that you're looking at, the experiment that you do is you integrate y of t-- which is this residual life function-- from 0 to t.
At the same time, at time t there's a certain number of renewals that have occurred, and you look at 1 over 2t times the sum of x sub n squared up to that point, not counting this last little bit of stuff here, and then you upper bound it by this sum, counting this little bit of extra stuff here. And we pretty much proved, in class and in the notes, that this little extra thing at the end doesn't make any difference, even if it's very big, because you're summing over such a long period of time. That's one argument involved there. The other argument involved is really very hidden, and you don't see it unless you write things down very carefully. But it tells you why the strong law of large numbers is so important. So I wanted to talk about that here a little bit.

What that says-- I mean, the thing that is kind of fishy here is that here we're summing up to n of t, and we don't really know what n of t is. It depends on how many arrivals have occurred. And if you write this out carefully as a sample path statement, what is it saying? Let's go on to the next slide, and we'll see what it's saying.
For the sample point omega-- let's assume for the moment that this limit exists-- what you're talking about is the sum, from n equals 1 up to the number of arrivals that have taken place up until time t for this particular sample point, of the squares of these inter-renewal times, and we're dividing by 2t, because we want to find the rate at which this is all going on. We write it out then as a limit of the sum of x sub n squared of omega divided by n of t of omega, times n of t of omega divided by 2t. In other words, we simply multiply and divide by n of t of omega.

Why do we want to do that? Because this expression here looks very familiar. It's a sum of random variables-- the sum of n of t of omega random variables-- but we know that as t gets large, n of t of omega gets large also. So we know that with probability 1, as t approaches infinity, this sum here-- if we forget about that term-- this sum here, by the strong law of large numbers, is equal to the expected value of x squared.
If we take the limit of this term, we get the limit of n of t of omega over 2t. The strong law for renewals tells us that is equal to 1 over 2 times the expected value of x. So that gives us our answer.

Now, why can we take a limit over this times this, and say that that's equal to the limit of this times the limit of that? If you're dealing with random variables, that's not correct in general. But here, what we're dealing with-- as soon as we put this omega in-- is a sum of numbers. And here we're dealing with a number also. For every value of t, this is a number. In other words, this is a numerical function of t, a real valued function of t. This is a real valued function of t. And what do you know about the limit of a product of two sequences of real numbers? If you know a little bit of analysis, then you know that if you take the product of two sequences and take the limit of that product, the answer that you get is the first limit times the second limit. OK?
I mean, you might not recognize the statement in that generality, but if I ask you: what is the sum of a n times b n, if we know that the limit of a n is equal to, say, a, and the limit of b n is equal to b? Then we know that this sum here, in the limit as n goes to infinity, is just going to be a times b. And you can sit down and argue that for yourselves, looking at the definition of what a limit is. So it's not a complicated thing.

But you can't do that if you're not dealing with a sample path notion of convergence here. You can't make that connection. If you only want to deal with the weak law of large numbers-- if you want to say infinity doesn't really make any sense because it doesn't exist, I can't wait that long, and therefore the strong law of large numbers doesn't make any sense-- then you can't go through that argument, and you can't get this very useful result, which says-- what was I trying to prove? I was trying to prove that the expected value of residual life as a time average is equal to the expected value of x squared divided by 2 times the expected value of x. This makes sense over finite times.
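That residual-life result can be checked numerically on one long sample path. A minimal sketch (Python; the Exponential(1) interarrival distribution and the horizon are assumptions, chosen so the predicted time average, E[X squared] over 2 E[X], works out to exactly 1):

```python
import random

def residual_life_average(t_max, draw, rng):
    """Time-average of the residual life y(t) over [0, t_max] on one sample path.

    Each complete interarrival X contributes a triangle of area X**2 / 2;
    the final, incomplete interval contributes only the part before t_max.
    """
    area, t = 0.0, 0.0
    while True:
        x = draw(rng)
        if t + x > t_max:
            leftover = x - (t_max - t)          # residual life remaining at t_max
            area += x * x / 2 - leftover * leftover / 2
            return area / t_max

        area += x * x / 2
        t += x

rng = random.Random(3)
avg = residual_life_average(200_000.0, lambda r: r.expovariate(1.0), rng)
print(avg)    # close to E[X^2] / (2 E[X]) = 2 / (2 * 1) = 1
```

The triangle areas x squared over 2 are exactly the terms in the sum up to n of t that the slide manipulates; dividing by t and letting t grow is the multiply-and-divide argument in miniature.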
Yes?

AUDIENCE: [INAUDIBLE]?

ROBERT GALLAGER: Yeah. No, I want to divide by n. And then I think that makes it all right, OK? Something like that. Let's see, is that right the way I have it now?

AUDIENCE: [INAUDIBLE].

ROBERT GALLAGER: Yeah, I want to do that-- it's right too, but if I want to take the summation-- oh no, the summation makes it messier, you're right. The thing I'm trying to state is that the limit as n goes to infinity of a n times b n is a times b. And there are restrictions there-- you can't have the limit of a n going to 0 or something, and the limit of b n going to infinity, or strange things like that. But what I'm arguing here-- I don't want you getting involved with this, because I haven't really thought about it. What I want you to think about is the fact that you can use the laws of analysis for real numbers, and whether you've studied analysis or not, you're all familiar with those, because you all use them all the time.
And when you're dealing with the strong law of large numbers, you can convert everything down to a sample path notion, and then you're simply dealing with limits of real numbers at that point. So you don't have to do anything fancy.

So this result would be hard to do in terms of ensemble averages. If you look at the end of Chapter 4 in the notes, you see that the arguments there get very tricky and very involved. It does the same thing eventually, but in a much harder way.

OK, for the sample point omega-- oh, we did that. OK, residual life and duration are examples of renewal reward functions, so this is just saying what we've already said, so let's not dwell on it anymore.

OK, stopping trials. Stopping trials will be on the quiz. The Wald equality will be on the quiz. We will get solutions to problem set seven back to you, although we won't get your graded solutions back to you before the quiz. I hope we'll get them out tomorrow. I hope. Yes, we will. They will be on the web.
A stopping trial is a positive integer valued random variable such that for each n, the indicator random variable-- the indicator of j equals n, in other words, the random variable which takes the value 1 if this random variable j is equal to n, and takes the value 0 otherwise-- is a function of x 1 up to x of n. If you look at x 1 to x of n, and you can tell, from just looking at that and not looking at the future at all, whether you're going to stop at time n, then you call that a stopping trial.

And we generalize that to look at a sequence x sub 1 to x sub n and some other set of random variables-- v sub 1 to v sub n. And the same argument: if the rule to stop is a rule which is based only on what you've seen up until time n, then it's called a stopping trial. A possibly defective stopping trial is the same, except that j might be a defective random variable. In other words, there might be some small probability that you never stop, that you just keep on going forever.
When you look at one of these problems that you use stopping rules on, it's not immediately evident, before you start to analyze it, whether you ever stop or not. So you have to analyze it somewhat before you know whether you're going to stop. So it's nice to do things in terms of defective stopping rules, because what you can do there holds true whether or not you stop.

Wald's equality then says: if these random variables are an IID sequence, and they each have a mean x-bar, and if j is a stopping trial, and if the expected value of j is less than infinity-- in other words, if it exists-- then the sum at the stopping trial satisfies: the expected value of the sum equals the expected value of x times the expected value of j.

Those of you who did the homework this week noticed three examples of where this is used not to find the expected value of s sub j, but where it's used to find the expected value of j. And I guess 90 percent of the examples I've seen do exactly that. You can find the expected value of s sub j very easily. I mean, you have an experiment where you keep going until something happens.
The something that happens is that the sum of these random variables reaches some limit. And when they reach the limit, you stop. If when they reach the limit you stop, you know what that limit is. You know what the expected value of s sub j is, because that's where you stop. And from that, if you know what x-bar is, you then know what the expected value of j is. So we should really state it as: the expected value of j is equal to the expected value of s sub j divided by the expected value of x, because that's where you usually use it.

OK, there's this question of whether the expected value of j has to be less than infinity or not. If the random variable x you're dealing with is a positive random variable, then you don't need to worry about that restriction. The only time you have to worry about this restriction is where x can be both positive and negative. And then you have to worry about it a little bit. If you don't understand what I just said, go back and look at that example of stop-when-you're-ahead. Because in the example of stop-when-you're-ahead, you can't use Wald's equality there-- it doesn't apply.
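A small simulation of that usual use of Wald's equality (Python; the Exponential(1) random variables and the threshold 10 are assumed for illustration -- x is positive here, so the restriction just discussed is not an issue):

```python
import random

def stopping_trial(threshold, rng):
    """Stop at the first j for which S_j = X_1 + ... + X_j reaches the threshold."""
    s, j = 0.0, 0
    while s < threshold:
        s += rng.expovariate(1.0)    # X_i >= 0 with x-bar = 1
        j += 1
    return j, s

rng = random.Random(4)
results = [stopping_trial(10.0, rng) for _ in range(20_000)]
e_j = sum(j for j, _ in results) / len(results)
e_sj = sum(s for _, s in results) / len(results)
x_bar = 1.0
print(e_sj, x_bar * e_j)   # Wald: E[S_J] = x-bar * E[J], so these nearly agree
```

Turned around the way the lecture suggests, E[J] is estimated as E[S_J] divided by x-bar: here s sub j is the threshold plus the overshoot past it, so you read off E[S_J] at the stopping point and get E[J] almost for free.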
438 00:25:34,870 --> 00:25:37,950 Because the expected amount of time until you stop is equal 439 00:25:37,950 --> 00:25:41,510 to infinity, and the random variable has both positive and 440 00:25:41,510 --> 00:25:42,930 negative values. 441 00:25:42,930 --> 00:25:47,790 And because of that, the whole thing breaks down. 442 00:25:47,790 --> 00:25:49,040 OK. 443 00:25:51,550 --> 00:25:54,650 Let's talk a little bit about Little's theorem. 444 00:25:54,650 --> 00:25:59,380 As we said last time, Little's theorem is essentially an 445 00:25:59,380 --> 00:26:00,630 accounting trick. 446 00:26:02,800 --> 00:26:05,250 I should tell you something about how I got into teaching 447 00:26:05,250 --> 00:26:07,390 this course. 448 00:26:07,390 --> 00:26:09,970 I got into teaching it because I was working on 449 00:26:09,970 --> 00:26:12,170 networks at the time. 450 00:26:12,170 --> 00:26:15,400 Queuing was essential in networks. 451 00:26:15,400 --> 00:26:18,740 And mathematicians have taken over the queuing field. 452 00:26:18,740 --> 00:26:21,070 And the results were so complicated, I couldn't 453 00:26:21,070 --> 00:26:23,140 understand them. 454 00:26:23,140 --> 00:26:24,970 So I started teaching it as a way of trying 455 00:26:24,970 --> 00:26:26,010 to understand them. 456 00:26:26,010 --> 00:26:28,770 And I looked at Little's theorem, and like any 457 00:26:28,770 --> 00:26:31,100 engineer, I said, aha. 458 00:26:31,100 --> 00:26:36,410 What's going on here is that the sum of these waiting times 459 00:26:36,410 --> 00:26:40,198 is equal to the integral of L of t, the difference between A 460 00:26:40,198 --> 00:26:44,360 of t and D of t, as you proceed. 461 00:26:44,360 --> 00:26:47,480 So there's this equality here. 462 00:26:47,480 --> 00:26:51,840 If I look at this next busy period, I 463 00:26:51,840 --> 00:26:53,406 have that same equality. 
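[Editor's note: not from the lecture — a sketch of that accounting identity on a toy FIFO single-server queue with made-up arrival and service times. The sum of the system times w sub i equals the integral of L of t = A of t minus D of t, taken out to any time at which the system has emptied.]

```python
# Hypothetical arrival and service times (chosen so the server goes idle
# between two busy periods); these numbers are illustrative only.
arrivals = [0.0, 0.5, 0.9, 3.0, 3.2]
services = [1.0, 0.4, 0.3, 0.5, 0.2]

# FIFO single server: departure_i = max(arrival_i, previous departure) + service_i
departures = []
for a, s in zip(arrivals, services):
    start = max(a, departures[-1]) if departures else a
    departures.append(start + s)

# w_i = time customer i spends in the system
w = [d - a for a, d in zip(arrivals, departures)]

def integral_L(arr, dep, T):
    # Integrate L(t) = A(t) - D(t), a step function, over [0, T];
    # L is constant between event times, so evaluate it at midpoints.
    pts = sorted(set(arr + dep + [0.0, T]))
    total = 0.0
    for t0, t1 in zip(pts, pts[1:]):
        mid = 0.5 * (t0 + t1)
        L = sum(a <= mid for a in arr) - sum(d <= mid for d in dep)
        total += L * (t1 - t0)
    return total

T = departures[-1]          # a time at which the system has just emptied
area = integral_L(arrivals, departures, T)
print(sum(w), area)         # the two sides of the accounting identity agree
```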
464 00:26:53,406 --> 00:26:55,910 If I look at the next busy period, I 465 00:26:55,910 --> 00:26:57,810 have the same equality. 466 00:26:57,810 --> 00:27:01,480 And anybody with any smidgen of common sense knows that 467 00:27:01,480 --> 00:27:08,180 that little amount of business at the end, in that final 468 00:27:08,180 --> 00:27:11,060 period, can't make any difference. 469 00:27:11,060 --> 00:27:14,100 And because of that, Little's theorem is just this 470 00:27:14,100 --> 00:27:15,515 accounting equality. 471 00:27:15,515 --> 00:27:19,460 It says that the sum of the w's is equal to the integral 472 00:27:19,460 --> 00:27:23,520 of L. And that's all there is to it. 473 00:27:23,520 --> 00:27:26,570 When you look at this more, and you look at funny queuing 474 00:27:26,570 --> 00:27:30,380 situations, you start to realize that these busy 475 00:27:30,380 --> 00:27:34,740 periods can take very long periods of time. 476 00:27:34,740 --> 00:27:36,710 They might be infinite. 477 00:27:36,710 --> 00:27:39,870 All sorts of strange things could happen. 478 00:27:39,870 --> 00:27:42,175 So you would like to be able to prove something. 479 00:27:44,830 --> 00:27:48,520 Now, what happens when you try to prove it, this is the other 480 00:27:48,520 --> 00:27:53,430 reason why I ignored it when I started teaching this course. 481 00:27:53,430 --> 00:27:55,070 Because I didn't understand the strong 482 00:27:55,070 --> 00:27:56,090 law of large numbers. 483 00:27:56,090 --> 00:27:58,020 I didn't understand what it was. 484 00:27:58,020 --> 00:28:00,730 Nobody had ever told me that this was a theorem about 485 00:28:00,730 --> 00:28:03,300 sample values. 486 00:28:03,300 --> 00:28:05,270 So I tried to prove it. 487 00:28:05,270 --> 00:28:07,380 And I said the following thing. 488 00:28:07,380 --> 00:28:11,160 The expected value of L by definition is 489 00:28:11,160 --> 00:28:13,700 the limit of 1 over t times an integral.
490 00:28:13,700 --> 00:28:17,930 L is the number of customers in the system at time t. 491 00:28:17,930 --> 00:28:20,340 So we're going to integrate the number of customers in a 492 00:28:20,340 --> 00:28:24,370 system over all this period t. 493 00:28:24,370 --> 00:28:26,630 And I'm going to multiply by 1 over t. 494 00:28:26,630 --> 00:28:28,650 Oh my God. 495 00:28:28,650 --> 00:28:32,020 Would you please interchange that limit and the 1 over t? 496 00:28:34,750 --> 00:28:37,320 I mean it's obvious when you look at it, that has to be 497 00:28:37,320 --> 00:28:38,570 what it is. 498 00:28:41,580 --> 00:28:44,960 And by this accounting identity, this is equal to the 499 00:28:44,960 --> 00:28:49,630 limit of this sum from I equals zero to N of t of 500 00:28:49,630 --> 00:28:51,180 w sub I, over t. 501 00:28:51,180 --> 00:28:56,710 With this question of having omitted this last little busy 502 00:28:56,710 --> 00:29:01,100 period, whatever part of it you're in when you get to t. 503 00:29:01,100 --> 00:29:02,980 I mean, that's the part that's common sense. 504 00:29:02,980 --> 00:29:04,990 You know you can do that. 505 00:29:04,990 --> 00:29:08,180 Now lambda is equal to the limit of 1 over 506 00:29:08,180 --> 00:29:10,290 t times A of t. 507 00:29:10,290 --> 00:29:17,150 A of t is just the number of arrivals up until time t. 508 00:29:17,150 --> 00:29:21,150 A of t, when we're doing Little's theorem, counts this 509 00:29:21,150 --> 00:29:24,770 fictitious arrival at time zero. 510 00:29:24,770 --> 00:29:29,690 Or I should say that the renewal theory omits as 511 00:29:29,690 --> 00:29:33,670 fictitious the real arrival at time zero, which is what's 512 00:29:33,670 --> 00:29:36,260 going on in Little's theorem. 513 00:29:36,260 --> 00:29:42,240 So then the expected value of w is going to be the limit as 514 00:29:42,240 --> 00:29:46,930 t approaches infinity of 1 over A of t times the sum of w 515 00:29:46,930 --> 00:29:51,070 sub I.
I'm going to break that up in the same way I broke 516 00:29:51,070 --> 00:29:52,950 this thing here up. 517 00:29:52,950 --> 00:29:59,360 It's the limit of t over A of t times the limit of the sum of w sub I 518 00:29:59,360 --> 00:30:02,250 from I equals 1 to A of t, times 1 over t. 519 00:30:02,250 --> 00:30:06,250 Breaking up this limit here requires taking this sample 520 00:30:06,250 --> 00:30:08,420 path view again. 521 00:30:08,420 --> 00:30:13,210 In other words, you look at a particular sample point omega. 522 00:30:13,210 --> 00:30:18,390 And for that particular sample point omega, you simply have 523 00:30:18,390 --> 00:30:21,130 the same thing as I was saying here. 524 00:30:21,130 --> 00:30:23,110 And I don't know whether I said it right or not. 525 00:30:23,110 --> 00:30:26,120 But anyway, that's what we're using. 526 00:30:26,120 --> 00:30:28,420 It does work for real numbers. 527 00:30:28,420 --> 00:30:34,110 And therefore, what we wind up with is this, which is 1 over 528 00:30:34,110 --> 00:30:40,410 lambda, and this, which is the expected value of L, from 529 00:30:40,410 --> 00:30:42,460 this, from the accounting identity. 530 00:30:42,460 --> 00:30:42,760 OK. 531 00:30:42,760 --> 00:30:47,790 So again, you're using the strong law as a way to get 532 00:30:47,790 --> 00:30:52,430 from random variables to numbers. 533 00:30:52,430 --> 00:30:56,940 And you understand how numbers work. 534 00:30:56,940 --> 00:31:00,720 OK, one more example of this same idea. 535 00:31:00,720 --> 00:31:07,750 One of the problems in the homework, problem set six are 536 00:31:07,750 --> 00:31:09,210 problems I thought-- yes. 537 00:31:09,210 --> 00:31:14,290 AUDIENCE: About the previous slide, that's from [INAUDIBLE] 538 00:31:14,290 --> 00:31:15,290 right? 539 00:31:15,290 --> 00:31:16,432 PROFESSOR: Yes, yes. 540 00:31:16,432 --> 00:31:17,916 Sorry.
541 00:31:17,916 --> 00:31:20,290 AUDIENCE: Then how-- 542 00:31:20,290 --> 00:31:25,420 is there an easy way to go from there to the [INAUDIBLE] 543 00:31:25,420 --> 00:31:26,334 distributions? 544 00:31:26,334 --> 00:31:29,010 PROFESSOR: Oh, to the ensemble average. 545 00:31:29,010 --> 00:31:30,820 Yes, there is, but not if you don't read the 546 00:31:30,820 --> 00:31:32,950 rest of Chapter 4. 547 00:31:32,950 --> 00:31:39,050 And it's not something I'm going to dwell on in class. 548 00:31:39,050 --> 00:31:40,300 It's-- 549 00:31:44,520 --> 00:31:47,620 it's something which is mathematically messy and 550 00:31:47,620 --> 00:31:49,750 fairly intricate. 551 00:31:49,750 --> 00:32:00,470 And in terms of common sense, you realize it has to be true. 552 00:32:00,470 --> 00:32:04,250 I mean, if the ensemble average, up until time t, is 553 00:32:04,250 --> 00:32:10,580 approaching the limit, then you must have the situation 554 00:32:10,580 --> 00:32:16,990 that that limit is equal to the sample path average. 555 00:32:16,990 --> 00:32:20,290 The question is whether it's approaching a limit or not. 556 00:32:20,290 --> 00:32:22,290 And that's not too hard to prove. 557 00:32:22,290 --> 00:32:25,980 But then you have all this mathematics of going through 558 00:32:25,980 --> 00:32:31,010 the details of it, which in fact is tricky. 559 00:32:31,010 --> 00:32:31,600 OK. 560 00:32:31,600 --> 00:32:35,170 So back to Markov chains and renewal processes. 561 00:32:35,170 --> 00:32:39,310 You remember you went through a rather tedious problem where 562 00:32:39,310 --> 00:32:41,790 you were supposed to use Chebyshev's inequality to 563 00:32:41,790 --> 00:32:44,180 prove something. 564 00:32:44,180 --> 00:32:47,440 And maybe half of you recognized that it would be 565 00:32:47,440 --> 00:32:51,060 far easier to use Markov's inequality. 566 00:32:51,060 --> 00:32:56,190 And if you did that, this is what I'm trying to do here.
567 00:32:56,190 --> 00:33:00,160 The question is, what is the expected amount of 568 00:33:00,160 --> 00:33:06,350 time, from state I in a Markov chain, until you return to 569 00:33:06,350 --> 00:33:12,270 state I in the Markov chain again? And you can do that by 570 00:33:12,270 --> 00:33:14,770 this theory of attaching rewards to 571 00:33:14,770 --> 00:33:16,910 states in a Markov chain. 572 00:33:16,910 --> 00:33:20,920 What you wind up with is this result that the expected 573 00:33:20,920 --> 00:33:27,400 renewal time in an ergodic Markov chain, is exactly equal 574 00:33:27,400 --> 00:33:31,910 to 1 over the steady state probability of that state. 575 00:33:31,910 --> 00:33:36,260 And if any of you find a simple and obvious way to 576 00:33:36,260 --> 00:33:39,420 prove that from the theory of Markov chains, I would be 577 00:33:39,420 --> 00:33:41,160 delighted to find it. 578 00:33:41,160 --> 00:33:44,270 Because I've never been able to find any way of doing that 579 00:33:44,270 --> 00:33:46,510 without going into renewal theory. 580 00:33:46,510 --> 00:33:50,100 And renewal theory lets you do it almost immediately. 581 00:33:50,100 --> 00:33:52,860 And it's a very useful result. 582 00:33:52,860 --> 00:33:54,120 So the argument is the following. 583 00:33:57,100 --> 00:33:59,840 You're going to let Y1, Y2, and so forth be the 584 00:33:59,840 --> 00:34:01,560 inter-renewal periods. 585 00:34:01,560 --> 00:34:04,080 Here we're looking at a sample path point of view again. 586 00:34:08,170 --> 00:34:09,719 No, we're not. 587 00:34:09,719 --> 00:34:12,844 Y1, Y2 are the random variables that are the 588 00:34:12,844 --> 00:34:15,010 inter-renewal periods. 589 00:34:15,010 --> 00:34:19,300 The elementary renewal theorem is something we talked about. 590 00:34:19,300 --> 00:34:24,840 It says that the expected value of n sub I of t divided 591 00:34:24,840 --> 00:34:30,230 by t is equal to 1 over the expected value of y.
592 00:34:34,230 --> 00:34:39,159 That's the elementary renewal theorem for renewal theory. 593 00:34:39,159 --> 00:34:42,630 So we've stopped talking about Markov chains now. 594 00:34:42,630 --> 00:34:46,080 We've said, for this Markov chain, you can look at 595 00:34:46,080 --> 00:34:51,060 the recurrences from successive visits to state I. Those form 596 00:34:51,060 --> 00:34:53,429 a renewal process. 597 00:34:53,429 --> 00:34:55,840 And according to the elementary renewal theorem for 598 00:34:55,840 --> 00:35:00,710 renewal processes, this is equal to 1 over y bar. 599 00:35:00,710 --> 00:35:03,200 Now we go back to Markov chains again. 600 00:35:03,200 --> 00:35:09,790 Let's look at the probability of being in state I at time t, 601 00:35:09,790 --> 00:35:14,080 given that we were in state I at time zero. 602 00:35:14,080 --> 00:35:18,520 That's the probability that n sub I of t, minus n sub I of t 603 00:35:18,520 --> 00:35:20,370 minus 1 is equal to 1. 604 00:35:20,370 --> 00:35:26,380 That's the probability that there was an arrival at time 605 00:35:26,380 --> 00:35:30,160 t, which in terms of this renewal process means there 606 00:35:30,160 --> 00:35:32,990 was a visit to state I at time t. 607 00:35:32,990 --> 00:35:37,400 Every time you get to state I, you call it a renewal. 608 00:35:37,400 --> 00:35:41,150 We've defined a renewal process which gives you a 609 00:35:41,150 --> 00:35:45,500 reward of 1 every time you hit state I, and reward of zero 610 00:35:45,500 --> 00:35:47,080 all the rest of the time. 611 00:35:47,080 --> 00:35:50,550 So this is equal to probability of Ni of t minus 612 00:35:50,550 --> 00:35:52,300 Ni of t minus 1. 613 00:35:52,300 --> 00:35:55,370 The probability that that's equal to 1. 614 00:35:55,370 --> 00:35:58,865 So it's the expected value of N sub I of t minus N sub 615 00:35:58,865 --> 00:36:00,670 I of t minus 1. 616 00:36:00,670 --> 00:36:02,420 That's the expected value.
617 00:36:02,420 --> 00:36:04,980 This is always greater than or equal to this. 618 00:36:04,980 --> 00:36:07,570 This is either 1 or it's zero. 619 00:36:07,570 --> 00:36:09,860 You can't have two arrivals. 620 00:36:09,860 --> 00:36:13,190 You can't have two visits at the same time. 621 00:36:13,190 --> 00:36:15,080 So you add up all of these things. 622 00:36:15,080 --> 00:36:21,340 You sum this from n equals 1 up to t, and what do you get? 623 00:36:21,340 --> 00:36:24,810 You sum this, and it's a telescoping series. 624 00:36:24,810 --> 00:36:31,470 So you add expected value of N sub I of 1 minus N sub I of 0, 625 00:36:31,470 --> 00:36:36,390 which is 0, plus N sub I of 2 minus N sub I of 1, plus N sub I of 3 626 00:36:36,390 --> 00:36:38,060 minus N sub I of 2. 627 00:36:38,060 --> 00:36:44,570 And everything cancels out except the N sub I of t. 628 00:36:44,570 --> 00:36:46,980 The I here is just the state we're looking at. 629 00:36:46,980 --> 00:36:48,810 We could've left it out. 630 00:36:48,810 --> 00:36:55,260 So p sub I, I of n approaches pi sub I exponentially. 631 00:36:55,260 --> 00:36:58,660 Because down there, very fast, it stays very close. 632 00:36:58,660 --> 00:37:04,720 If we sum up over n of them, and these are quantities which 633 00:37:04,720 --> 00:37:09,330 are approaching this limit, pi sub I, exponentially fast, 634 00:37:09,330 --> 00:37:12,000 then the sum divided by the number of terms we're summing 635 00:37:12,000 --> 00:37:15,180 over is just pi sub I also. 636 00:37:15,180 --> 00:37:19,520 So what we have is pi sub I equal to this limit, 637 00:37:19,520 --> 00:37:22,190 the limit of the expected value of N sub I of t over t. 638 00:37:22,190 --> 00:37:27,770 The elementary renewal theorem says that's equal to 1 639 00:37:27,770 --> 00:37:28,900 over y bar. 640 00:37:28,900 --> 00:37:33,070 So the expected recurrence time for state I is equal to 641 00:37:33,070 --> 00:37:35,870 1 over pi sub I.
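[Editor's note: not from the lecture — a numerical sanity check of this conclusion, that the expected recurrence time of a state equals 1 over its steady-state probability. The two-state chain below, with transition probabilities 0.9/0.1 and 0.2/0.8, is a hypothetical example.]

```python
import random

random.seed(0)

# A small ergodic chain: from state 0 stay with prob 0.9, from state 1 stay with 0.8.
P = {0: [(0, 0.9), (1, 0.1)],
     1: [(0, 0.2), (1, 0.8)]}

def step(state):
    # sample the next state from the transition probabilities
    u, acc = random.random(), 0.0
    for nxt, p in P[state]:
        acc += p
        if u < acc:
            return nxt
    return P[state][-1][0]

# Steady state from the balance equation pi0 * 0.1 = pi1 * 0.2, so pi0 = 2/3.
pi0 = 0.2 / (0.1 + 0.2)

# Simulate a long path and record the gaps between successive visits to state 0.
state, last, gaps = 0, 0, []
for t in range(1, 200001):
    state = step(state)
    if state == 0:
        gaps.append(t - last)
        last = t

mean_return = sum(gaps) / len(gaps)
print(mean_return, 1 / pi0)   # both should be close to 1.5
```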
642 00:37:35,870 --> 00:37:39,230 If you look carefully at what I've done there, I have 643 00:37:39,230 --> 00:37:40,480 assumed that-- 644 00:37:45,590 --> 00:37:52,220 well, when I did it, I was assuming it was an ergodic 645 00:37:52,220 --> 00:37:53,690 Markov chain. 646 00:37:53,690 --> 00:37:56,080 I don't think I have to assume that. 647 00:37:56,080 --> 00:38:01,790 I think it can be periodic, and this is still true. 648 00:38:01,790 --> 00:38:03,770 You can sort that out for yourselves. 649 00:38:12,240 --> 00:38:16,110 That is the first slide I'll use which says, whenever you 650 00:38:16,110 --> 00:38:19,800 see a problem trying to prove something about Markov chains 651 00:38:19,800 --> 00:38:22,190 and you don't know how to prove it right away, think 652 00:38:22,190 --> 00:38:24,540 about renewal theory. 653 00:38:24,540 --> 00:38:26,930 The other one will say, whenever you're thinking about 654 00:38:26,930 --> 00:38:30,040 a problem in renewal theory, and you don't see how to deal 655 00:38:30,040 --> 00:38:33,730 with it immediately, think about Markov chains. 656 00:38:33,730 --> 00:38:36,690 You can go back and forth between the two of them. 657 00:38:36,690 --> 00:38:38,060 It's just like when we were dealing 658 00:38:38,060 --> 00:38:40,240 with Poisson processes. 659 00:38:40,240 --> 00:38:43,280 Again, in terms of solving problems with Poisson 660 00:38:43,280 --> 00:38:48,310 processes, what was most useful? 661 00:38:48,310 --> 00:38:50,650 It was this idea that you could look at a Poisson 662 00:38:50,650 --> 00:38:53,960 process in three different ways. 663 00:38:53,960 --> 00:38:58,350 You could look at it as a sum of exponential [INAUDIBLE] 664 00:38:58,350 --> 00:38:59,700 arrival times. 665 00:38:59,700 --> 00:39:01,450 You could look at it as somebody 666 00:39:01,450 --> 00:39:04,750 throwing darts on a line. 
667 00:39:04,750 --> 00:39:09,460 And you could look at it as a Bernoulli process which is 668 00:39:09,460 --> 00:39:14,150 shrunk down to time zero with more and more arrivals. 669 00:39:14,150 --> 00:39:17,920 By being able to look at each of the three ways, you can 670 00:39:17,920 --> 00:39:20,760 solve pieces of the problem using whichever one of these 671 00:39:20,760 --> 00:39:22,340 things is most convenient. 672 00:39:22,340 --> 00:39:23,590 This is the same thing too. 673 00:39:23,590 --> 00:39:29,740 For renewal processes and Markov chains, you can go back 674 00:39:29,740 --> 00:39:34,050 and forth between what you know about each of them, and 675 00:39:34,050 --> 00:39:35,520 find things about the other. 676 00:39:35,520 --> 00:39:39,530 So it's a useful thing. 677 00:39:39,530 --> 00:39:42,210 Expected number of renewals. 678 00:39:42,210 --> 00:39:46,920 Expected number of renewals is so important in renewal theory 679 00:39:46,920 --> 00:39:50,810 that most people call it m of t, which is the expected 680 00:39:50,810 --> 00:39:53,020 value of n of t. 681 00:39:53,020 --> 00:39:56,920 n of t, by definition, is the number of renewals that have 682 00:39:56,920 --> 00:39:58,820 occurred by time t. 683 00:39:58,820 --> 00:40:02,610 The elementary renewal theorem says the limit, as t goes to 684 00:40:02,610 --> 00:40:06,780 infinity, of expected value of n of t over t is equal to 1 685 00:40:06,780 --> 00:40:08,030 over x bar. 686 00:40:10,280 --> 00:40:12,100 Now, what happens here? 687 00:40:12,100 --> 00:40:14,480 That's a very nice limit theorem. 688 00:40:14,480 --> 00:40:17,660 But if you look at trying to calculate the expected value 689 00:40:17,660 --> 00:40:23,420 of n of t for finite t, there are situations where you get a 690 00:40:23,420 --> 00:40:26,790 real bloody mess. 691 00:40:26,790 --> 00:40:30,110 And one example of that is where the inter-arrival 692 00:40:30,110 --> 00:40:33,330 interval is 1 or the square root of 2.
693 00:40:35,920 --> 00:40:40,350 Now, these are not rationally related. 694 00:40:40,350 --> 00:40:43,560 So you start looking at the times at which 695 00:40:43,560 --> 00:40:45,490 renewals can occur. 696 00:40:45,490 --> 00:40:49,330 And it's any integer times 1, plus any integer times the 697 00:40:49,330 --> 00:40:51,770 square root of 2. 698 00:40:51,770 --> 00:40:56,860 So the number of possibilities within a particular range is 699 00:40:56,860 --> 00:40:59,870 growing with the square of that range. 700 00:40:59,870 --> 00:41:04,040 So what you find, as t gets very large, is possible 701 00:41:04,040 --> 00:41:08,140 arrival instants are getting more and more dense. 702 00:41:08,140 --> 00:41:10,180 There are more and more times when possible 703 00:41:10,180 --> 00:41:12,100 arrivals can occur. 704 00:41:12,100 --> 00:41:15,600 There's less and less structure to the time between 705 00:41:15,600 --> 00:41:16,850 those possible arrivals. 706 00:41:19,800 --> 00:41:24,900 And the magnitude of how much the jump is at that possible 707 00:41:24,900 --> 00:41:28,440 time, there's no nice structure to that. 708 00:41:28,440 --> 00:41:32,960 Sometimes it's big, sometimes it's little. 709 00:41:32,960 --> 00:41:35,730 So m of t is going to look like an 710 00:41:35,730 --> 00:41:39,390 enormously jagged function. 711 00:41:39,390 --> 00:41:44,450 When you get out to some large t, it's increasing. 712 00:41:50,810 --> 00:41:58,310 And you know by the elementary renewal theory that this is 713 00:41:58,310 --> 00:42:02,910 going to get close to a straight line here. 714 00:42:02,910 --> 00:42:05,950 m of t over t is going to be constant. 715 00:42:05,950 --> 00:42:09,650 But you have no idea of what the fine structure of this is. 716 00:42:09,650 --> 00:42:13,760 The fine structure can be extraordinarily complicated.
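[Editor's note: my own illustration, not the lecture's — even though the fine structure of m of t is a mess for this arithmetic-vs-irrational example, the elementary renewal theorem still pins down the slope. A Monte Carlo sketch with inter-renewal times 1 or square root of 2, equally likely:]

```python
import random

random.seed(2)
ROOT2 = 2 ** 0.5
XBAR = (1 + ROOT2) / 2      # mean inter-renewal time, about 1.207

def n_of_t(t):
    # number of renewal epochs in (0, t] when each gap is 1 or sqrt(2), equally likely
    s, n = 0.0, 0
    while True:
        s += 1.0 if random.random() < 0.5 else ROOT2
        if s > t:
            return n
        n += 1

T = 200.0
m_est = sum(n_of_t(T) for _ in range(5000)) / 5000   # Monte Carlo estimate of m(T)
print(m_est / T, 1 / XBAR)   # the slope m(t)/t, both values close to 0.83
```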
717 00:42:13,760 --> 00:42:18,200 And this bothered people, of course, because it's always 718 00:42:18,200 --> 00:42:21,480 bothersome when you start out with a problem that looks very 719 00:42:21,480 --> 00:42:26,290 simple, and you try to ask a very simple question about it. 720 00:42:26,290 --> 00:42:30,110 And you get an enormously complicated answer. 721 00:42:30,110 --> 00:42:33,050 So an enormous amount of work has been done on this problem, 722 00:42:33,050 --> 00:42:35,440 much more than it's worth. 723 00:42:35,440 --> 00:42:39,940 But we can't ignore it all because it impacts on a lot of 724 00:42:39,940 --> 00:42:42,290 other things. 725 00:42:42,290 --> 00:42:48,630 So if we forget about this kind of situation here, where 726 00:42:48,630 --> 00:42:53,610 the inter-arrival interval is either 1 or something which is 727 00:42:53,610 --> 00:42:54,860 irrational. 728 00:42:58,560 --> 00:43:02,670 If you look at an inter-arrival interval which 729 00:43:02,670 --> 00:43:07,750 is continuous, where you have a probability density. 730 00:43:07,750 --> 00:43:11,360 And if the probability density is very nicely defined, then 731 00:43:11,360 --> 00:43:14,610 you can do things much more easily. 732 00:43:14,610 --> 00:43:19,310 And what you do to do that is you invent something called 733 00:43:19,310 --> 00:43:21,360 the renewal equation. 734 00:43:21,360 --> 00:43:23,160 Or else you look in a textbook and you 735 00:43:23,160 --> 00:43:25,670 find the renewal equation. 736 00:43:25,670 --> 00:43:27,140 And you see how that's derived. 737 00:43:27,140 --> 00:43:27,840 It's not hard. 738 00:43:27,840 --> 00:43:29,250 I'm not going to go through it. 739 00:43:29,250 --> 00:43:32,220 Because it has very little to do with what we're trying to 740 00:43:32,220 --> 00:43:33,450 accomplish here.
741 00:43:33,450 --> 00:43:37,060 But what the renewal equation says is that the expected 742 00:43:37,060 --> 00:43:42,220 number of renewals up until time t satisfies the equation. 743 00:43:48,550 --> 00:43:56,096 m of t is the probability that x is less than or equal to t, 744 00:43:56,096 --> 00:44:00,780 plus a convolution here of m of t minus x 745 00:44:00,780 --> 00:44:03,090 times dF sub X of x. 746 00:44:03,090 --> 00:44:05,900 If you state this in terms of densities, it's easier to make 747 00:44:05,900 --> 00:44:07,260 sense out of it. 748 00:44:07,260 --> 00:44:12,560 It's the integral from zero to t of 1 plus m of t minus x, 749 00:44:12,560 --> 00:44:18,020 all times the density of x, dx. This first term here just goes 750 00:44:18,020 --> 00:44:22,660 to 1 after t gets very large. 751 00:44:22,660 --> 00:44:23,820 So it's not very important. 752 00:44:23,820 --> 00:44:27,860 It's a transient term. The important part is this 753 00:44:27,860 --> 00:44:33,820 convolution here, which tells you how m of t is increasing. 754 00:44:33,820 --> 00:44:42,910 You look at that, and you say, that looks very familiar. 755 00:44:42,910 --> 00:44:45,680 For electrical engineering students who have ever studied 756 00:44:45,680 --> 00:44:50,580 linear systems, that kind of equation is what you spend 757 00:44:50,580 --> 00:44:52,020 your life studying. 758 00:44:52,020 --> 00:44:55,170 At least it's what I used to spend my life studying back 759 00:44:55,170 --> 00:44:57,550 when they didn't know so much about electrical engineering. 760 00:44:57,550 --> 00:45:00,100 Now, there are too many things to learn. 761 00:45:00,100 --> 00:45:02,600 So you might not know very much about this. 762 00:45:02,600 --> 00:45:07,030 But this is a very common linear equation in m of t. 763 00:45:07,030 --> 00:45:08,720 It's an integral equation-- 764 00:45:08,720 --> 00:45:11,600 an equation from which you can find 765 00:45:11,600 --> 00:45:13,460 out what m of t is.
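[Editor's note: a concrete sketch, not from the lecture, of marching the renewal equation m(t) = F(t) + integral from 0 to t of m(t-x) f(x) dx forward on a grid. The exponential density of rate 2 is chosen because the exact answer is known there: m(t) = lambda t.]

```python
import math

LAM = 2.0                  # exponential inter-arrival rate (illustrative choice)
H = 0.01                   # grid step
N = 500                    # solve the renewal equation out to t = N*H = 5

F = lambda t: 1.0 - math.exp(-LAM * t)   # inter-arrival CDF
f = lambda t: LAM * math.exp(-LAM * t)   # inter-arrival density

# Trapezoid rule on the convolution; the x = 0 endpoint of the integral
# involves m(t) itself with weight h*f(0)/2, so solve for m(t) explicitly.
m = [0.0] * (N + 1)
for k in range(1, N + 1):
    conv = sum(f(j * H) * m[k - j] for j in range(1, k)) * H
    m[k] = (F(k * H) + conv) / (1.0 - H * f(0.0) / 2.0)

# For the exponential density the exact solution is m(t) = lambda * t.
print(m[N], LAM * N * H)
```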
766 00:45:13,460 --> 00:45:18,010 You can think conceptually of starting out at m of 767 00:45:18,010 --> 00:45:19,880 0, which you know. 768 00:45:19,880 --> 00:45:21,350 And then you use this equation to 769 00:45:21,350 --> 00:45:23,370 build yourself up gradually. 770 00:45:23,370 --> 00:45:25,210 So you start at m of 0 equals 0. 771 00:45:25,210 --> 00:45:28,730 Then you look at m of epsilon, where you get this 772 00:45:28,730 --> 00:45:29,880 term here a little bit. 773 00:45:29,880 --> 00:45:33,670 And you break this all up into intervals. 774 00:45:33,670 --> 00:45:35,680 And pretty soon, you find something very messy 775 00:45:35,680 --> 00:45:39,980 happening, which is why people were interested in it. 776 00:45:39,980 --> 00:45:45,950 But it can be solved if this density has a 777 00:45:45,950 --> 00:45:47,200 rational Laplace transform. 778 00:45:49,830 --> 00:45:53,900 Now I don't care about how to solve that equation. 779 00:45:53,900 --> 00:45:56,410 That has nothing to do with the rest of the course. 780 00:45:56,410 --> 00:45:59,150 But the solution has the following form. 781 00:45:59,150 --> 00:46:04,170 The expected value of n of t is going up with 782 00:46:04,170 --> 00:46:08,610 t as t over x bar. 783 00:46:08,610 --> 00:46:11,400 I mean, you know it has to be doing that. 784 00:46:11,400 --> 00:46:15,600 Because we know from the elementary renewal theorem 785 00:46:15,600 --> 00:46:19,770 that eventually the expected value of n of t over t has to 786 00:46:19,770 --> 00:46:21,580 look like 1 over x bar. 787 00:46:21,580 --> 00:46:24,950 So that's that term here, m of t over t looks 788 00:46:24,950 --> 00:46:27,510 like 1 over x bar. 789 00:46:27,510 --> 00:46:30,890 There's this next term, which is a constant term, which is 790 00:46:30,890 --> 00:46:38,720 sigma squared of x divided by 2 times x bar squared. 791 00:46:38,720 --> 00:46:43,420 That looks sort of satisfying because it's dimensionless.
792 00:46:43,420 --> 00:46:46,980 Minus 1/2, plus some function here which is just a 793 00:46:46,980 --> 00:46:50,680 transient, which goes away as t gets large. 794 00:46:50,680 --> 00:46:54,030 Now you have to go through some work to get this. 795 00:46:54,030 --> 00:46:58,780 But it's worthwhile to try to interpret what this is saying 796 00:46:58,780 --> 00:47:00,660 a little bit. 797 00:47:00,660 --> 00:47:04,700 This epsilon of t, it's all this mess we 798 00:47:04,700 --> 00:47:06,730 were visualizing here. 799 00:47:06,730 --> 00:47:11,000 Except of course, this result doesn't apply to 800 00:47:11,000 --> 00:47:12,700 messy things like this. 801 00:47:12,700 --> 00:47:17,970 That only applies to those very simple functions that 802 00:47:17,970 --> 00:47:20,310 circuit theory people used to study. 803 00:47:20,310 --> 00:47:23,520 Because they were relevant for inductors and capacitors and 804 00:47:23,520 --> 00:47:25,665 resistors and that kind of stuff. 805 00:47:28,290 --> 00:47:30,640 So we had this most important term. 806 00:47:30,640 --> 00:47:33,810 We have this term, which asymptotically goes away. 807 00:47:33,810 --> 00:47:39,240 And we have this term here, which looks rather strange. 808 00:47:39,240 --> 00:47:46,860 Because what this is saying is that this asymptotic form 809 00:47:46,860 --> 00:47:59,930 here, m of t, as a function of t, has this slope here, which 810 00:47:59,930 --> 00:48:03,922 is 1 over x bar as the slope. 811 00:48:03,922 --> 00:48:06,490 Then it has something added to it. 812 00:48:10,390 --> 00:48:15,230 This is t over x bar plus some constant. 813 00:48:15,230 --> 00:48:19,460 And in this case, it's sigma squared over 2 x bar 814 00:48:19,460 --> 00:48:21,270 squared minus 1/2. 815 00:48:25,090 --> 00:48:28,960 If you notice what that sigma squared over 2 x bar squared 816 00:48:28,960 --> 00:48:32,620 minus 1/2 is for an exponential random 817 00:48:32,620 --> 00:48:36,150 variable, it's zero.
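[Editor's note: a numerical sketch of that constant term, not from the lecture. It computes sigma squared over 2 x-bar squared minus 1/2 for two distributions, and then checks by simulation that for uniform(0,1) inter-arrivals, m(t) minus t over x-bar really does settle near that constant. The choice of uniform(0,1) and t = 50 is illustrative.]

```python
import random

random.seed(3)

def const_term(mean, var):
    # the constant in m(t) = t/x-bar + sigma^2/(2 x-bar^2) - 1/2 + epsilon(t)
    return var / (2.0 * mean ** 2) - 0.5

print(const_term(1.0, 1.0))            # exponential, mean 1: the constant is 0.0

c_unif = const_term(0.5, 1.0 / 12.0)   # uniform(0,1): 1/6 - 1/2 = -1/3

def n_of_t(t):
    # renewal count by time t with uniform(0,1) inter-arrival times
    s, n = 0.0, 0
    while True:
        s += random.random()
        if s > t:
            return n
        n += 1

t = 50.0
est = sum(n_of_t(t) for _ in range(40000)) / 40000   # Monte Carlo estimate of m(t)
print(est - t / 0.5, c_unif)           # both should be near -1/3
```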
818 00:48:36,150 --> 00:48:40,420 So for an exponential random variable which has no memory, 819 00:48:40,420 --> 00:48:44,060 you're right back to this curve here, which makes a 820 00:48:44,060 --> 00:48:45,270 certain amount of sense. 821 00:48:45,270 --> 00:48:50,090 Because this is some kind of transient at zero, which says 822 00:48:50,090 --> 00:48:51,645 where you start out makes a difference. 823 00:48:54,390 --> 00:48:58,000 Now, if you look at a-- 824 00:48:58,000 --> 00:49:02,380 I mean, I'm using this answer for simple random variables 825 00:49:02,380 --> 00:49:05,460 that I can understand just to get some insight about this. 826 00:49:05,460 --> 00:49:12,560 But suppose you look at a random variable, which is 827 00:49:12,560 --> 00:49:13,810 deterministic. 828 00:49:16,443 --> 00:49:23,170 x equals 1 with probability 1. 829 00:49:23,170 --> 00:49:25,200 What do these renewals look like then? 830 00:49:29,440 --> 00:49:30,250 Start out here. 831 00:49:30,250 --> 00:49:32,140 No renewals for a while. 832 00:49:32,140 --> 00:49:33,440 Then you go up here. 833 00:49:39,290 --> 00:49:41,313 You're always underneath this curve here. 834 00:49:43,980 --> 00:49:47,340 You look at a very heavy tailed distribution 835 00:49:47,340 --> 00:49:51,770 like this thing we-- 836 00:49:51,770 --> 00:49:53,430 you remember this distribution? 837 00:49:53,430 --> 00:49:58,280 Where you have x is equal to epsilon with probability 1 838 00:49:58,280 --> 00:50:01,650 minus epsilon, and that's equal to 1 over epsilon with 839 00:50:01,650 --> 00:50:02,820 probability epsilon. 840 00:50:02,820 --> 00:50:11,660 So that the sample functions look like a whole bunch of 841 00:50:11,660 --> 00:50:12,910 very quick ones. 842 00:50:16,040 --> 00:50:18,370 And then there's this very long one. 843 00:50:18,370 --> 00:50:21,910 And then a lot of little quick ones.
844 00:50:21,910 --> 00:50:25,070 We can look at what that's doing as far as expected value 845 00:50:25,070 --> 00:50:26,800 of n of t is concerned. 846 00:50:26,800 --> 00:50:29,020 You're going to get an enormous number of arrivals 847 00:50:29,020 --> 00:50:31,360 right at the beginning. 848 00:50:31,360 --> 00:50:33,770 And then you're going to go for this long period of time 849 00:50:33,770 --> 00:50:35,160 with nothing. 850 00:50:35,160 --> 00:50:40,010 So you're going to have this term here, which is sticking-- 851 00:50:48,119 --> 00:50:50,190 let's put you way up here. 852 00:50:50,190 --> 00:50:52,080 OK? 853 00:50:52,080 --> 00:50:55,510 So now, because of this transient, 854 00:50:55,510 --> 00:50:59,430 you're starting out at a particular instant when you 855 00:50:59,430 --> 00:51:02,810 don't know whether you're going to get an epsilon or a 1 856 00:51:02,810 --> 00:51:03,710 over epsilon. 857 00:51:03,710 --> 00:51:07,060 You're not in one of these periods where you're waiting 858 00:51:07,060 --> 00:51:10,095 for this 1/epsilon period to end. 859 00:51:12,870 --> 00:51:15,610 Then you get that term there. 860 00:51:15,610 --> 00:51:20,950 Then you get this epsilon of t, which is just a term that 861 00:51:20,950 --> 00:51:22,040 goes away. 862 00:51:22,040 --> 00:51:25,380 OK so that's what this formula is telling you. 863 00:51:29,000 --> 00:51:30,280 Want to talk a little bit about 864 00:51:30,280 --> 00:51:32,380 Blackwell's theorem also. 865 00:51:32,380 --> 00:51:34,130 Mostly talking about things today that we're 866 00:51:34,130 --> 00:51:35,380 not going to prove. 867 00:51:38,850 --> 00:51:42,980 I mean, it's not necessary to prove everything in your life. 868 00:51:42,980 --> 00:51:45,890 It's important to prove those things which really give you 869 00:51:45,890 --> 00:51:48,770 insight about the result.
870 00:51:48,770 --> 00:51:50,820 I'm actually going to prove one form of Blackwell's 871 00:51:50,820 --> 00:51:52,680 theorem today. 872 00:51:52,680 --> 00:51:54,830 And I'll prove it-- 873 00:51:54,830 --> 00:51:56,730 guess how I'm going to prove it? 874 00:51:56,730 --> 00:52:00,580 I mean, the theme of the lecture today is if something 875 00:52:00,580 --> 00:52:03,780 is puzzling in renewal theory, use Markov chain theory. 876 00:52:03,780 --> 00:52:05,490 That's what I'm going to do. 877 00:52:05,490 --> 00:52:06,440 OK. 878 00:52:06,440 --> 00:52:09,420 What does Blackwell's theorem say? 879 00:52:09,420 --> 00:52:14,010 It says that the expected renewal rate for large t is 1 880 00:52:14,010 --> 00:52:16,282 over x bar. 881 00:52:16,282 --> 00:52:21,150 OK, it says that if I look at a little tiny increment here 882 00:52:21,150 --> 00:52:33,680 instead of this crazy curve here, if I look way out after 883 00:52:33,680 --> 00:52:41,430 years and years have gone by, it says that I'm going to be 884 00:52:41,430 --> 00:52:48,580 increasing at a rate which is very, very close to nothing. 885 00:52:48,580 --> 00:52:51,530 I might be lifted up a little bit or I might be lifted down 886 00:52:51,530 --> 00:52:55,170 a little bit, but the amount of change in some very tiny 887 00:52:55,170 --> 00:53:03,710 increment here of t to t plus epsilon. 888 00:53:03,710 --> 00:53:07,910 Blackwell's theorem tries to say that the expected change 889 00:53:07,910 --> 00:53:13,610 in this very tiny increment here of size epsilon is equal 890 00:53:13,610 --> 00:53:18,556 to 1 over x bar times epsilon. 891 00:53:18,556 --> 00:53:30,500 The expectation equals epsilon over expected value of x. 892 00:53:30,500 --> 00:53:32,310 OK? 893 00:53:32,310 --> 00:53:35,110 So it's saying what the elementary renewal theorem 894 00:53:35,110 --> 00:53:37,270 says, but something a lot more. 
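Blackwell's claim about tiny increments can be checked by simulation. A hedged sketch (my own, with an assumed Uniform(0, 2) inter-renewal distribution, so x bar = 1 and the distribution has a density): estimate the expected number of renewals in (t, t + delta] at a large t and compare with delta over x bar.

```python
import random

# Monte Carlo check of Blackwell's statement (my own illustration):
# for X ~ Uniform(0, 2), x_bar = 1, so the expected number of renewals
# in a small window (t, t + delta] should approach delta / x_bar.

def renewals_in_window(t, delta, rng):
    """Count renewal epochs falling in (t, t + delta] on one sample path."""
    s, count = 0.0, 0
    while s <= t + delta:
        s += rng.uniform(0.0, 2.0)   # one inter-renewal interval
        if t < s <= t + delta:
            count += 1
    return count

rng = random.Random(0)
t, delta, runs = 50.0, 0.5, 20000
estimate = sum(renewals_in_window(t, delta, rng) for _ in range(runs)) / runs
print(round(estimate, 3))   # should be close to delta / x_bar = 0.5
```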
895 00:53:37,270 --> 00:53:40,600 The elementary renewal theorem says you take n of t, you 896 00:53:40,600 --> 00:53:42,220 divide it by t. 897 00:53:42,220 --> 00:53:45,780 When you divide it by t, you lose all the structure. 898 00:53:45,780 --> 00:53:49,620 That's why the elementary renewal theorem is a fairly 899 00:53:49,620 --> 00:53:51,810 simple statement to understand. 900 00:53:51,810 --> 00:53:54,290 Blackwell's theorem is not getting rid of all the 901 00:53:54,290 --> 00:53:58,250 structure; it's just looking at what happens in this very 902 00:53:58,250 --> 00:54:01,730 tiny increment here and saying, it 903 00:54:01,730 --> 00:54:04,120 behaves in this way. 904 00:54:04,120 --> 00:54:11,410 Now, you think about this and you say, that's not possible. 905 00:54:11,410 --> 00:54:13,510 And it's not possible because I've only stated half of 906 00:54:13,510 --> 00:54:16,335 Blackwell's theorem. 907 00:54:16,335 --> 00:54:20,680 The other part of Blackwell's theorem says if you have a 908 00:54:20,680 --> 00:54:25,670 process that can only take jumps at, say, integer times, 909 00:54:25,670 --> 00:54:30,230 then this can only change at integer times. 910 00:54:30,230 --> 00:54:33,070 And since it can only change at integer times, I can't look 911 00:54:33,070 --> 00:54:35,470 at things very closely, I can't make 912 00:54:35,470 --> 00:54:36,960 epsilon very small. 913 00:54:36,960 --> 00:54:40,530 I can only look at intervals which are multiples of that 914 00:54:40,530 --> 00:54:42,060 change time. 915 00:54:42,060 --> 00:54:42,470 OK. 916 00:54:42,470 --> 00:54:47,500 So that's what he's trying to say. 917 00:54:50,390 --> 00:54:57,990 But the other thing that he's not saying, when I look at 918 00:54:57,990 --> 00:55:03,900 this very tiny interval here between t and t plus epsilon, 919 00:55:03,900 --> 00:55:10,916 it looks like he's saying that m of t has a density and that 920 00:55:10,916 --> 00:55:13,950 this density is 1 over x bar. 
921 00:55:13,950 --> 00:55:16,710 And it can't have a density, either. 922 00:55:16,710 --> 00:55:21,630 If I had any discrete random variable at all, that discrete 923 00:55:21,630 --> 00:55:26,190 random variable can only take jumps at discrete times. 924 00:55:26,190 --> 00:55:28,990 So you can never have a density here. 925 00:55:28,990 --> 00:55:32,260 If you have a density to start with, then maybe you have a 926 00:55:32,260 --> 00:55:34,840 density after you're through. 927 00:55:34,840 --> 00:55:37,080 But you can't claim it. 928 00:55:37,080 --> 00:55:39,970 So all you can claim is that for very small intervals, you 929 00:55:39,970 --> 00:55:41,070 have this kind of change. 930 00:55:41,070 --> 00:55:44,590 You'll see that that's exactly what his theorem says. 931 00:55:44,590 --> 00:55:50,090 But when you make this distinction between densities 932 00:55:50,090 --> 00:55:55,890 and discrete, we still haven't captured the whole thing. 933 00:55:55,890 --> 00:56:08,410 Because if the interarrival interval is an integer random 934 00:56:08,410 --> 00:56:11,500 variable, namely it can only change at integer times, then 935 00:56:11,500 --> 00:56:13,120 you know what has to happen here. 936 00:56:13,120 --> 00:56:16,840 You can only have changes at integer times. 937 00:56:16,840 --> 00:56:21,800 You can generalize that a little bit by saying if every 938 00:56:21,800 --> 00:56:29,950 possible value of the inter-renewal interval is a 939 00:56:29,950 --> 00:56:33,280 multiple of some constant, then you just scale the 940 00:56:33,280 --> 00:56:35,840 integers to be less or greater. 941 00:56:35,840 --> 00:56:38,500 So the same thing happens. 942 00:56:38,500 --> 00:56:42,010 When you have these random variables, like one that takes 943 00:56:42,010 --> 00:56:47,080 the value 1 or the value square root of 2, that's where this 944 00:56:47,080 --> 00:56:51,060 thing gets very ugly and nothing very nice happens. 
945 00:56:51,060 --> 00:56:55,500 So Blackwell said fundamentally, there are two 946 00:56:55,500 --> 00:56:58,310 kinds of distribution functions-- 947 00:56:58,310 --> 00:56:59,560 arithmetic and non-arithmetic. 948 00:57:02,710 --> 00:57:04,800 I would say there are two kinds-- 949 00:57:04,800 --> 00:57:07,830 discrete and continuous. 950 00:57:07,830 --> 00:57:10,430 But he was a better mathematician than that and he 951 00:57:10,430 --> 00:57:13,090 thought this problem through more. 952 00:57:13,090 --> 00:57:17,190 So he knew that he wanted to lump all of the non-arithmetic 953 00:57:17,190 --> 00:57:19,830 things together. 954 00:57:19,830 --> 00:57:24,530 A random variable has an arithmetic distribution if its 955 00:57:24,530 --> 00:57:28,650 possible sample values are all integer multiples of some 956 00:57:28,650 --> 00:57:30,610 number, say lambda. 957 00:57:30,610 --> 00:57:35,690 In other words, if it's an integer valued distribution 958 00:57:35,690 --> 00:57:37,190 with some scaling on it. 959 00:57:37,190 --> 00:57:39,420 That's what he's saying there. 960 00:57:39,420 --> 00:57:43,810 All the values are integers, but you can scale the integers 961 00:57:43,810 --> 00:57:48,080 bigger or less by multiplying them by some number lambda. 962 00:57:48,080 --> 00:57:51,450 And the largest such choice of lambda is called the "span of 963 00:57:51,450 --> 00:57:55,690 the distribution." So that when you look at m of t, what 964 00:57:55,690 --> 00:57:59,280 you're going to find is that when t gets larger, you're only going to get 965 00:57:59,280 --> 00:58:03,590 changes at multiples of this span value and nothing else. 966 00:58:03,590 --> 00:58:08,660 So each time you have an integer times lambda, you will 967 00:58:08,660 --> 00:58:09,680 get a jump. 968 00:58:09,680 --> 00:58:12,450 And each time you don't have an integer times lambda, it 969 00:58:12,450 --> 00:58:15,000 has to stay constant. 970 00:58:15,000 --> 00:58:17,620 OK. 
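The span of an arithmetic distribution can be computed mechanically. A small sketch (my own helper, with assumed rational support points; not something from the notes): put the values on a common integer grid and take a gcd.

```python
from fractions import Fraction
from functools import reduce
from math import gcd

# The span of an arithmetic distribution: the largest lambda such that
# every possible sample value is an integer multiple of lambda.

def span(support):
    fracs = [Fraction(v) for v in support]
    # lcm of the denominators puts every value on a common integer grid
    lcm = reduce(lambda a, b: a * b // gcd(a, b), (f.denominator for f in fracs))
    grid = [int(f * lcm) for f in fracs]      # values as integer multiples of 1/lcm
    return Fraction(reduce(gcd, grid), lcm)

print(span(["1.5", "2.5", "4"]))   # 1/2: every value is a multiple of 0.5
print(span([2, 4, 6]))             # 2: the span need not be 1
```

For a set like {1, sqrt(2)} no such lambda exists, which is exactly the non-arithmetic discrete case being discussed.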
971 00:58:17,620 --> 00:58:22,740 So if x is arithmetic with span lambda greater than 0, 972 00:58:22,740 --> 00:58:28,130 then every sum of random variables has to be arithmetic 973 00:58:28,130 --> 00:58:34,260 with a span either lambda or an integer multiple of lambda. 974 00:58:34,260 --> 00:58:38,340 So n of t can increase only at multiples of lambda. 975 00:58:38,340 --> 00:58:43,670 If you have a non-arithmetic discrete distribution like 1 976 00:58:43,670 --> 00:58:48,820 and pi, the points at which n of t can increase become dense 977 00:58:48,820 --> 00:58:50,240 as t approaches infinity. 978 00:58:50,240 --> 00:58:53,810 Well, what we're doing here is separating life into three 979 00:58:53,810 --> 00:58:56,040 different kinds of things. 980 00:58:56,040 --> 00:58:58,990 Into arithmetic distributions, which are like integer 981 00:58:58,990 --> 00:59:00,470 distributions. 982 00:59:00,470 --> 00:59:05,120 These awful things which have, say, two possible values and are 983 00:59:05,120 --> 00:59:10,260 discrete, but they take values which are not rationally 984 00:59:10,260 --> 00:59:11,700 related to each other. 985 00:59:11,700 --> 00:59:15,030 Points at which n of t can increase become dense. 986 00:59:15,030 --> 00:59:18,850 And finally, the third one is if you have a density and then 987 00:59:18,850 --> 00:59:21,260 you have something very nice again. 988 00:59:21,260 --> 00:59:27,000 So what Blackwell's theorem says is that the limit as t goes 989 00:59:27,000 --> 00:59:33,120 to infinity of m of t plus lambda minus m of t is equal 990 00:59:33,120 --> 00:59:37,580 to lambda divided by x bar. 991 00:59:37,580 --> 00:59:38,640 OK? 992 00:59:38,640 --> 00:59:42,120 This is if you have an arithmetic x 993 00:59:42,120 --> 00:59:44,000 and a span of lambda. 994 00:59:44,000 --> 00:59:46,130 This is what we were saying should be the thing that 995 00:59:46,130 --> 00:59:51,020 happens and Blackwell proved that that is what happens. 
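The arithmetic form of the theorem is easy to test numerically. A sketch (my own choice of distribution: X = 1 or 2, each with probability 1/2, so x bar = 3/2 and the span lambda is 1): the probability of a renewal at a large integer time n should approach lambda over x bar, which is 2/3 here.

```python
import random

# Empirical check of the arithmetic case of Blackwell's theorem
# (illustrative distribution of my choosing, not from the lecture).

def has_renewal_at(n, rng):
    """Does some renewal epoch land exactly on integer time n?"""
    s = 0
    while s < n:
        s += 1 if rng.random() < 0.5 else 2   # X = 1 or 2, each w.p. 1/2
    return s == n

rng = random.Random(42)
n, runs = 100, 20000
prob = sum(has_renewal_at(n, rng) for _ in range(runs)) / runs
print(round(prob, 3))   # should be near lambda / x_bar = 1 / 1.5 = 2/3
```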
996 00:59:51,020 --> 00:59:56,260 And he said that as t gets very large, in fact, what 997 00:59:56,260 --> 01:00:01,290 happens here is that this becomes very, very regular. 998 01:00:01,290 --> 01:00:04,320 Every span you get a little jump which is equal to the 999 01:00:04,320 --> 01:00:08,650 span size times the expected rate of increase. 1000 01:00:08,650 --> 01:00:12,220 And then you go level for that span interval and you go up a 1001 01:00:12,220 --> 01:00:13,470 little more. 1002 01:00:18,220 --> 01:00:24,210 So that what you get in a limit is a staircase function. 1003 01:00:31,010 --> 01:00:35,162 And this is lambda here. 1004 01:00:35,162 --> 01:00:41,231 And this is lambda times 1 over x bar here. 1005 01:00:41,231 --> 01:00:42,830 OK? 1006 01:00:42,830 --> 01:00:48,220 And this might have this added or subtracted thing that we 1007 01:00:48,220 --> 01:00:50,735 were talking about before. 1008 01:00:50,735 --> 01:00:54,855 But in this interval, that's the way it behaves. 1009 01:00:54,855 --> 01:00:57,300 What he was saying for non-arithmetic random 1010 01:00:57,300 --> 01:01:01,120 variables, even for this awful thing like pi with probability 1011 01:01:01,120 --> 01:01:09,260 half and 1 with probability half, he was saying pick any 1012 01:01:09,260 --> 01:01:16,750 delta that you want, 10 to the minus 6, and the limit as t 1013 01:01:16,750 --> 01:01:21,840 goes to infinity of m of t plus delta minus m of t is 1014 01:01:21,840 --> 01:01:24,950 equal to delta over x bar. 1015 01:01:24,950 --> 01:01:30,280 A density has this behavior, but these awful things, like 1016 01:01:30,280 --> 01:01:33,410 this example we're looking at, which gets very-- 1017 01:01:33,410 --> 01:01:37,830 I mean, you have these points of increase which are getting 1018 01:01:37,830 --> 01:01:41,880 more and more dense and increases which are more and 1019 01:01:41,880 --> 01:01:42,530 more random. 1020 01:01:42,530 --> 01:01:43,217 Yes. 
1021 01:01:43,217 --> 01:01:45,570 AUDIENCE: Is this true only for x bar finite or is this 1022 01:01:45,570 --> 01:01:47,858 true even if x bar is infinite? 1023 01:01:47,858 --> 01:01:49,306 PROFESSOR: If x bar is infinite? 1024 01:02:10,470 --> 01:02:13,580 I would guess it's true if x bar is infinite because then 1025 01:02:13,580 --> 01:02:15,583 you're saying that you hardly ever get increases. 1026 01:02:18,200 --> 01:02:21,040 But I have to look at it more carefully. 1027 01:02:21,040 --> 01:02:24,470 I mean, if x bar is infinite, it says that in the limit as t 1028 01:02:24,470 --> 01:02:28,200 goes to infinity, this difference here goes to 0. 1029 01:02:28,200 --> 01:02:30,110 And it does. 1030 01:02:30,110 --> 01:02:34,970 So yes, it should hold true then, but I certainly wouldn't 1031 01:02:34,970 --> 01:02:37,300 know how to prove it. 1032 01:02:37,300 --> 01:02:41,440 And if you ask me what odds, I would bet you $10 to 1033 01:02:41,440 --> 01:02:42,600 $1 that it's true. 1034 01:02:42,600 --> 01:02:43,095 OK? 1035 01:02:43,095 --> 01:02:44,345 [CHUCKLE]. 1036 01:02:49,530 --> 01:02:51,020 OK. 1037 01:02:51,020 --> 01:02:56,170 Blackwell's theorem uses very difficult analysis and doesn't 1038 01:02:56,170 --> 01:02:58,460 lead to much insight. 1039 01:02:58,460 --> 01:03:00,690 I've tried to read 1040 01:03:00,690 --> 01:03:02,760 proofs of Blackwell's theorem. 1041 01:03:02,760 --> 01:03:06,640 I've tried to read Blackwell's proof, and Blackwell is a guy 1042 01:03:06,640 --> 01:03:09,810 who writes extraordinarily well. I've tried to read other 1043 01:03:09,810 --> 01:03:12,520 people's proofs of it, and I've never managed to get through 1044 01:03:12,520 --> 01:03:18,010 one of those proofs and say, yes, I agree with that. 1045 01:03:18,010 --> 01:03:19,510 But maybe you can go through them. 1046 01:03:19,510 --> 01:03:20,902 It's hard to know. 
1047 01:03:20,902 --> 01:03:25,070 But I wouldn't recommend it to anyone 1048 01:03:25,070 --> 01:03:27,910 except my worst enemies. 1049 01:03:27,910 --> 01:03:30,850 The hard case here is this non-arithmetic but discrete 1050 01:03:30,850 --> 01:03:32,780 distributions. 1051 01:03:32,780 --> 01:03:37,730 What I'm going to do is prove it for you now as returns to a 1052 01:03:37,730 --> 01:03:40,140 given state in a Markov chain. 1053 01:03:40,140 --> 01:03:45,410 In other words, if you have a renewal interval which is 1054 01:03:45,410 --> 01:03:51,120 integer, you can only get renewals at times 1, 2, 3, 4, 1055 01:03:51,120 --> 01:03:54,450 5, up to some finite limit. 1056 01:03:54,450 --> 01:03:57,410 Then I claim you can always draw a Markov chain for this. 1057 01:03:57,410 --> 01:04:00,400 And if I can draw a Markov chain for it, then I can solve 1058 01:04:00,400 --> 01:04:01,550 the problem. 1059 01:04:01,550 --> 01:04:06,450 And the answer that I get is a surprisingly familiar result 1060 01:04:06,450 --> 01:04:11,060 for Markov chains and it proves Blackwell's theorem for 1061 01:04:11,060 --> 01:04:12,310 that special case. 1062 01:04:16,730 --> 01:04:21,320 OK, so for any renewal process with inter-renewals at a 1063 01:04:21,320 --> 01:04:25,130 finite set of integer times, there's a corresponding Markov 1064 01:04:25,130 --> 01:04:28,370 chain which models returns to state 0. 1065 01:04:28,370 --> 01:04:31,210 I'm just going to pick an arbitrary state 0. 1066 01:04:31,210 --> 01:04:34,560 I want to find the intervals between successive 1067 01:04:34,560 --> 01:04:37,680 returns to state 0. 1068 01:04:37,680 --> 01:04:41,070 And what I'm doing here in the Markov chain is I start 1069 01:04:41,070 --> 01:04:42,320 off at state 0. 1070 01:04:44,660 --> 01:04:48,790 The next thing that happens is I might come back to state 0 1071 01:04:48,790 --> 01:04:50,600 in the next interval. 
1072 01:04:50,600 --> 01:04:52,680 In the version that got passed out to you, the 1073 01:04:52,680 --> 01:04:54,840 self-loop is not there. 1074 01:04:54,840 --> 01:04:57,820 The self-loop really corresponds to returns in time 1075 01:04:57,820 --> 01:05:00,210 1, so it should be there. 1076 01:05:00,210 --> 01:05:04,800 If you don't return in time 1, then you're going to go off to 1077 01:05:04,800 --> 01:05:07,180 state 1, as we'll call it. 1078 01:05:07,180 --> 01:05:11,600 From state 1, you can return in one more time interval, 1079 01:05:11,600 --> 01:05:15,310 which means you get back in time 2. 1080 01:05:15,310 --> 01:05:19,990 Here you get back in time 1, here you get back in time 2, 1081 01:05:19,990 --> 01:05:24,380 here you get back in time 3, here you get back in time 4, 1082 01:05:24,380 --> 01:05:25,380 and so forth. 1083 01:05:25,380 --> 01:05:29,570 So you can always draw a chain like this. 1084 01:05:29,570 --> 01:05:32,030 And the transition probabilities-- 1085 01:05:32,030 --> 01:05:35,970 a nice homework problem would be to show that the probability 1086 01:05:35,970 --> 01:05:43,370 of starting at i and going to i plus 1 is exactly this. 1087 01:05:43,370 --> 01:05:44,840 Why is it this? 1088 01:05:44,840 --> 01:05:49,830 Well, multiply this probability by this 1089 01:05:49,830 --> 01:05:53,230 probability by this probability by this 1090 01:05:53,230 --> 01:06:06,270 probability, and what you get is the probability that the 1091 01:06:06,270 --> 01:06:09,480 return takes five steps or more. 1092 01:06:09,480 --> 01:06:13,450 You multiply this by this by this by this. 1093 01:06:13,450 --> 01:06:17,050 Multiply this thing for different values of i, and 1094 01:06:17,050 --> 01:06:19,920 what happens? 1095 01:06:19,920 --> 01:06:21,910 Successive terms-- 1096 01:06:21,910 --> 01:06:23,470 this all cancels out. 1097 01:06:23,470 --> 01:06:27,780 So you wind up with 1 minus F sub X of i 1098 01:06:27,780 --> 01:06:29,180 when you're all done. 
1099 01:06:29,180 --> 01:06:31,100 Or i plus 1. 1100 01:06:31,100 --> 01:06:32,580 OK? 1101 01:06:32,580 --> 01:06:35,850 Now here's the interesting thing. 1102 01:06:35,850 --> 01:06:41,470 For a lazy person like me, it doesn't make any difference 1103 01:06:41,470 --> 01:06:44,350 whether I've gotten this formula right or not. 1104 01:06:44,350 --> 01:06:47,430 I think I have it right, but I don't care. 1105 01:06:47,430 --> 01:06:49,380 I've only done it right because I know that some of 1106 01:06:49,380 --> 01:06:52,110 you would be worried about it and some of you would think I 1107 01:06:52,110 --> 01:06:55,580 was ignorant if I didn't show it to you. 1108 01:06:55,580 --> 01:06:58,530 But it doesn't make any difference. 1109 01:06:58,530 --> 01:07:01,900 When I get done writing down that this is the Markov chain 1110 01:07:01,900 --> 01:07:05,440 that I'm interested in, I look at this and I 1111 01:07:05,440 --> 01:07:06,480 say, this is ergodic. 1112 01:07:06,480 --> 01:07:08,980 I can get from any state here to any other state. 1113 01:07:11,890 --> 01:07:14,910 I will also assume that it's aperiodic because if it 1114 01:07:14,910 --> 01:07:18,990 weren't aperiodic, I would just leave out the states 1, 1115 01:07:18,990 --> 01:07:23,183 3, 5, and so forth for that period 2 and so forth. 1116 01:07:23,183 --> 01:07:24,050 OK. 1117 01:07:24,050 --> 01:07:26,700 So then we know that the limit as n goes to 1118 01:07:26,700 --> 01:07:30,390 infinity of p sub 00 of n. 1119 01:07:30,390 --> 01:07:33,530 In other words, the probability of being in state 1120 01:07:33,530 --> 01:07:37,630 0 at time n given that you were in state 0 at 1121 01:07:37,630 --> 01:07:41,540 time 0 is pi 0. 1122 01:07:41,540 --> 01:07:43,550 pi 0 I can calculate. 1123 01:07:43,550 --> 01:07:48,130 If I'm careful enough calculating this, I can also 1124 01:07:48,130 --> 01:07:51,150 calculate the steady state probabilities here. 
1125 01:07:51,150 --> 01:07:57,100 Whether I'm careful here or not, I know that after I get 1126 01:07:57,100 --> 01:08:00,160 rid of the periodicity here, that I have something which is 1127 01:08:00,160 --> 01:08:01,280 ergodic here. 1128 01:08:01,280 --> 01:08:05,830 So I know I can find those pi's. 1129 01:08:05,830 --> 01:08:10,830 Now, so we know that. 1130 01:08:10,830 --> 01:08:17,990 pi 0, we already saw earlier today, is equal to 1 over the 1131 01:08:17,990 --> 01:08:22,740 expected renewal time between visits to state 0. 1132 01:08:22,740 --> 01:08:27,880 So with pi 0 equal to 1 over x bar and this equal to pi 0, 1133 01:08:27,880 --> 01:08:32,090 the difference between the expected number of 1134 01:08:32,090 --> 01:08:35,370 renewals at time n and the expected number at time n 1135 01:08:35,370 --> 01:08:41,080 minus 1 is exactly 1 over x bar, which is exactly what 1136 01:08:41,080 --> 01:08:42,317 Blackwell said. 1137 01:08:42,317 --> 01:08:42,714 Yes. 1138 01:08:42,714 --> 01:08:45,216 AUDIENCE: Can you please explain why this proves the 1139 01:08:45,216 --> 01:08:45,694 Blackwell theorem? 1140 01:08:45,694 --> 01:08:48,090 I don't really see it. 1141 01:08:48,090 --> 01:08:51,060 PROFESSOR: Oh, I proved the Blackwell theorem because what 1142 01:08:51,060 --> 01:09:01,109 I've shown here is that as n gets large, the probability 1143 01:09:01,109 --> 01:09:05,775 that you will be in state 0 at time n given that you're in 1144 01:09:05,775 --> 01:09:09,460 state 0 at time 0-- in other words, I'm starting off this 1145 01:09:09,460 --> 01:09:12,210 renewal process in state 0. 1146 01:09:12,210 --> 01:09:20,029 So the probability of being in state 0 at time n is really 1147 01:09:20,029 --> 01:09:23,229 exactly this thing that Blackwell was talking about. 1148 01:09:30,609 --> 01:09:32,290 OK? 
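The chain construction and the identity pi 0 = 1 over x bar can be checked in exact arithmetic. A sketch (my own code, following the lecture's construction; the pmf 1/2, 1/4, 1/4 for X = 1, 2, 3 is an assumed example, and the closed form pi i = P(X > i)/E[X] is a standard candidate being verified, not proved here):

```python
from fractions import Fraction

# States 0..m-1 track time since the last renewal. From state i, return
# to 0 with probability P(X = i+1)/P(X > i), else move on to state i+1.

def build_chain(pmf):
    """pmf[k] = P(X = k + 1); returns the transition matrix and tail probs."""
    pmf = [Fraction(p) for p in pmf]
    m = len(pmf)
    tail = [sum(pmf[i:], Fraction(0)) for i in range(m)]   # tail[i] = P(X > i)
    P = [[Fraction(0)] * m for _ in range(m)]
    for i in range(m):
        P[i][0] = pmf[i] / tail[i]                # renewal after i + 1 steps
        if i + 1 < m:
            P[i][i + 1] = tail[i + 1] / tail[i]   # keep waiting
    return P, tail

pmf = ["1/2", "1/4", "1/4"]                 # assumed example: X = 1, 2, 3
P, tail = build_chain(pmf)
mean = sum((k + 1) * Fraction(p) for k, p in enumerate(pmf))   # E[X] = 7/4
pi = [t / mean for t in tail]               # candidate: pi_i = P(X > i) / E[X]
piP = [sum(pi[i] * P[i][j] for i in range(len(pi))) for j in range(len(pi))]
print(piP == pi, pi[0], 1 / mean)           # stationary, and pi_0 = 1/x_bar
```

With exact rationals the check pi P = pi holds identically, and pi[0] comes out to 4/7, which is 1 over x bar for this pmf.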
1149 01:09:32,290 --> 01:09:33,800 Blackwell was saying-- 1150 01:09:33,800 --> 01:09:36,040 I mean, lambda here is 1 because I've just 1151 01:09:36,040 --> 01:09:37,540 gotten rid of that. 1152 01:09:37,540 --> 01:09:44,620 So the limit of m of t plus 1 minus m of t is the 1153 01:09:44,620 --> 01:09:50,380 expectation of a renewal at time t plus 1. 1154 01:09:50,380 --> 01:09:51,380 OK? 1155 01:09:51,380 --> 01:09:53,250 And that's 1 over x bar. 1156 01:09:59,010 --> 01:10:02,770 AUDIENCE: So why is this renewal process-- why is this 1157 01:10:02,770 --> 01:10:05,670 Markov chain model exactly what we have in renewal? 1158 01:10:05,670 --> 01:10:09,370 So you're claiming that we have a renewal if and only if 1159 01:10:09,370 --> 01:10:11,770 we return to state 0 in this Markov chain. 1160 01:10:11,770 --> 01:10:12,265 PROFESSOR: Yeah. 1161 01:10:12,265 --> 01:10:13,255 AUDIENCE: So that's the thing I don't see. 1162 01:10:13,255 --> 01:10:15,525 Is it supposed to be obvious? 1163 01:10:15,525 --> 01:10:16,775 PROFESSOR: Oh, you don't see why that's true? 1164 01:10:19,490 --> 01:10:23,520 Let me try to do that. 1165 01:10:23,520 --> 01:10:27,520 I thought that at least was obvious, but as I found as I 1166 01:10:27,520 --> 01:10:30,840 tried to develop this course, things which are obvious are 1167 01:10:30,840 --> 01:10:33,340 the things which are often not obvious. 1168 01:10:41,320 --> 01:10:45,560 If I have a random variable, let's say, which takes on the 1169 01:10:45,560 --> 01:10:49,330 value 1 with probability 1/2 and the value 2 with 1170 01:10:49,330 --> 01:10:54,610 probability 1/2 and I use that as the inter-renewal time for 1171 01:10:54,610 --> 01:11:01,670 a renewal process, then starting off in time 0 with 1172 01:11:01,670 --> 01:11:08,180 probability 1/2, I will have a renewal in time 1 and I will 1173 01:11:08,180 --> 01:11:13,530 have a renewal in time 2 with probability 1/2 also. 
1174 01:11:13,530 --> 01:11:14,210 AUDIENCE: Right. 1175 01:11:14,210 --> 01:11:17,070 PROFESSOR: That's exactly what this says if I draw it for-- 1176 01:11:29,270 --> 01:11:30,686 I don't need that. 1177 01:11:41,040 --> 01:11:41,520 AUDIENCE: I see. 1178 01:11:41,520 --> 01:11:42,000 PROFESSOR: OK? 1179 01:11:42,000 --> 01:11:42,960 AUDIENCE: OK. 1180 01:11:42,960 --> 01:11:44,880 Do you mind doing it for a slightly more complicated 1181 01:11:44,880 --> 01:11:48,240 example just so it's easier to see in full generality? 1182 01:11:48,240 --> 01:11:51,995 So it looks like [INAUDIBLE] values or something. 1183 01:11:51,995 --> 01:11:53,050 PROFESSOR: OK. 1184 01:11:53,050 --> 01:11:56,150 And then this won't be-- 1185 01:11:56,150 --> 01:11:59,150 let's make this 1/2. 1186 01:11:59,150 --> 01:12:01,490 And this 1/2. 1187 01:12:01,490 --> 01:12:02,940 OK. 1188 01:12:02,940 --> 01:12:04,190 And this-- 1189 01:12:06,230 --> 01:12:07,480 what is this going to be? 1190 01:12:11,910 --> 01:12:13,630 I mean, this has to be 1 at this point. 1191 01:12:13,630 --> 01:12:16,950 AUDIENCE: So then this would take on 1 with probability 1192 01:12:16,950 --> 01:12:20,742 1/2, 2 with probability 1/4, and 3 with probability 1193 01:12:20,742 --> 01:12:21,714 [INAUDIBLE]? 1194 01:12:21,714 --> 01:12:23,172 PROFESSOR: I think so, yes. 1195 01:12:23,172 --> 01:12:24,630 AUDIENCE: Good. 1196 01:12:24,630 --> 01:12:25,130 Thanks. 1197 01:12:25,130 --> 01:12:26,170 PROFESSOR: I mean, this is a question of whether I've 1198 01:12:26,170 --> 01:12:28,850 calculated these numbers right or not. 1199 01:12:28,850 --> 01:12:32,620 And looking at this example, I'm not at all sure I have. 1200 01:12:32,620 --> 01:12:35,050 But as I say, it doesn't make any difference. 
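The example on the board, with transition probabilities 1/2, 1/2, and 1, can also be checked by simulating returns to state 0. A sketch (my own; the run count and seed are arbitrary): the observed return times should be distributed as P(X=1) = 1/2, P(X=2) = 1/4, P(X=3) = 1/4.

```python
import random

def sample_return_time(rng):
    """One inter-return time of the three-state chain with p = 1/2, 1/2, 1."""
    state, steps = 0, 0
    while True:
        steps += 1
        if state == 0:
            state = 0 if rng.random() < 0.5 else 1   # return in time 1, or move on
        elif state == 1:
            state = 0 if rng.random() < 0.5 else 2   # return in time 2, or move on
        else:
            state = 0                                # state 2 returns for sure
        if state == 0:
            return steps

rng = random.Random(7)
runs = 40000
counts = {1: 0, 2: 0, 3: 0}
for _ in range(runs):
    counts[sample_return_time(rng)] += 1
freqs = {k: v / runs for k, v in counts.items()}
print({k: round(v, 3) for k, v in freqs.items()})
```

So the chain with those transition probabilities really does reproduce the 1/2, 1/4, 1/4 renewal distribution, which is the point of the exchange above.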
1201 01:12:35,050 --> 01:12:40,350 I mean, so long as you buy the fact that if I don't return in 1202 01:12:40,350 --> 01:12:45,230 time 0, then I'm in some situation where it's already 1203 01:12:45,230 --> 01:12:48,030 taken me one unit of time, I'm not through, I have to 1204 01:12:48,030 --> 01:12:54,790 continue and I keep continuing and that's-- 1205 01:12:54,790 --> 01:12:56,040 OK? 1206 01:12:58,070 --> 01:12:58,610 OK. 1207 01:12:58,610 --> 01:13:00,020 And there's-- 1208 01:13:00,020 --> 01:13:01,050 oh. 1209 01:13:01,050 --> 01:13:04,250 I already explained delayed renewal processes. 1210 01:13:04,250 --> 01:13:06,360 I will explain it again. 1211 01:13:06,360 --> 01:13:10,530 A delayed renewal process is a modification of a renewal 1212 01:13:10,530 --> 01:13:16,010 process for which the first inter-renewal interval x1 has 1213 01:13:16,010 --> 01:13:19,110 a different distribution than the others. 1214 01:13:19,110 --> 01:13:23,905 And the intervals are all independent of each other. 1215 01:13:23,905 --> 01:13:28,800 So that the first interarrival period might do anything. 1216 01:13:28,800 --> 01:13:31,820 After that, they all do the same thing. 1217 01:13:31,820 --> 01:13:35,710 And the argument here is if you're looking at a limit 1218 01:13:35,710 --> 01:13:39,960 theorem that how long any-- 1219 01:13:39,960 --> 01:13:45,100 if you're looking at the limit of how many arrivals occur 1220 01:13:45,100 --> 01:13:48,210 over a very long period of time, the amount of time it 1221 01:13:48,210 --> 01:13:51,350 takes this first arrival to occur doesn't make any 1222 01:13:51,350 --> 01:13:52,360 difference. 1223 01:13:52,360 --> 01:13:54,230 It occurs at some time. 1224 01:13:54,230 --> 01:13:57,150 And after that, it gets amortized over an enormously 1225 01:13:57,150 --> 01:14:00,550 long time which is going to infinity. 
1226 01:14:00,550 --> 01:14:04,030 So if it takes a year for the first arrival to occur, I look 1227 01:14:04,030 --> 01:14:06,590 at 1,000 years. 1228 01:14:06,590 --> 01:14:09,250 If it only takes me six months for the first arrival to 1229 01:14:09,250 --> 01:14:14,330 occur, well, I still look at 1,000 years, but I mean, you 1230 01:14:14,330 --> 01:14:15,510 see the point. 1231 01:14:15,510 --> 01:14:19,220 This first interval becomes unimportant compared with 1232 01:14:19,220 --> 01:14:20,910 everything else. 1233 01:14:20,910 --> 01:14:23,180 And because of that, the strong law 1234 01:14:23,180 --> 01:14:24,430 still is going to hold. 1235 01:14:33,700 --> 01:14:38,520 That says that convergence in probability also occurs. 1236 01:14:38,520 --> 01:14:44,490 All it does as far as m of t is concerned, the expected 1237 01:14:44,490 --> 01:14:48,520 value of n of t, is move it up or move it down, but it 1238 01:14:48,520 --> 01:14:51,980 doesn't change the slope of it and so forth. 1239 01:14:54,490 --> 01:14:58,780 Even if the expected time for the first renewal is infinite. 1240 01:14:58,780 --> 01:15:03,200 And that sounds very strange, but that still is true and 1241 01:15:03,200 --> 01:15:05,280 it's true by essentially the same argument. 1242 01:15:05,280 --> 01:15:07,470 You wait until the first arrival occurs. 1243 01:15:07,470 --> 01:15:09,730 It has to occur at some point. 1244 01:15:09,730 --> 01:15:12,360 And after that, you can amortize that over as long as 1245 01:15:12,360 --> 01:15:14,500 you want, you're just looking at a limit. 1246 01:15:14,500 --> 01:15:17,040 When you look at a limit, you can take as long as you want 1247 01:15:17,040 --> 01:15:20,460 to, and you take long enough that you wash out the stuff at 1248 01:15:20,460 --> 01:15:22,060 the beginning. 1249 01:15:22,060 --> 01:15:28,400 I mean, if you told me that, I would say 1250 01:15:28,400 --> 01:15:30,950 you're waving your arms. 
1251 01:15:30,950 --> 01:15:37,190 But if you read the last section of the notes and you 1252 01:15:37,190 --> 01:15:40,750 summarize it, that's exactly how you will summarize it. 1253 01:15:40,750 --> 01:15:42,000 OK.
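The arm-waving argument about delayed renewal processes is easy to check numerically. A sketch (my own choice of numbers: ordinary intervals Uniform(0, 2) with x bar = 1, and a deliberately huge first interval of 500): the long-run rate N(t)/t comes out near 1/x bar either way, because the first interval gets amortized over the whole horizon.

```python
import random

def count_renewals(horizon, first, rng):
    """Renewals up to `horizon`: first interval = `first`, the rest Uniform(0, 2)."""
    t, n = first, 0
    while t <= horizon:
        n += 1                       # a renewal occurs at epoch t
        t += rng.uniform(0.0, 2.0)   # ordinary inter-renewal, mean x_bar = 1
    return n

rng = random.Random(3)
horizon = 50000.0
rate_delayed = count_renewals(horizon, first=500.0, rng=rng) / horizon
rate_plain = count_renewals(horizon, first=rng.uniform(0.0, 2.0), rng=rng) / horizon
print(round(rate_delayed, 3), round(rate_plain, 3))   # both near 1 / x_bar = 1
```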