Let us now look at some numerical examples to get a feel for how the central limit theorem works in practice.

Let us look at a discrete random variable that has a uniform distribution on the range from 1 to 8. If we add two independent random variables drawn from this PMF, we obtain a random variable whose PMF is the convolution of this PMF with itself. We can even carry out this calculation by hand, and we get a triangular PMF. So this is what we get for the case where n is equal to 2.

Now we can keep doing this. If we add four of these discrete uniforms, of course assumed independent, then we obtain a PMF that starts to have a shape close to that of a normal. And if you take as many as 32 of them, then the PMF of the sum of 32 discrete uniforms is almost identical to the shape that you would get if you were to draw a normal PDF. So with n as small as 32, we have essentially converged.

In fact, this convergence is so good that in practice, people quite often use this idea to generate random samples of a normal random variable. What do you do? You draw 32 random samples from a discrete uniform, add them up, and what you get is essentially a sample of a normal random variable.
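Since the plots shown in the lecture are not reproduced here, the following is a minimal Python sketch, not part of the lecture itself, that carries out the same experiment numerically: it computes the exact PMF of the sum of n independent uniforms on {1, ..., 8} by repeated convolution, compares it pointwise to a normal density with matching mean and variance, and then uses the sum-of-32-uniforms trick to produce approximately normal samples. The helper pmf_of_sum and the use of NumPy are illustrative choices rather than anything prescribed by the lecture.

```python
# A minimal sketch (not from the lecture) of the experiment described above,
# assuming a discrete uniform PMF on the values 1, ..., 8.
import numpy as np

pmf = np.full(8, 1 / 8)  # PMF of X: uniform on {1, ..., 8}

def pmf_of_sum(pmf, n):
    """Exact PMF of the sum of n i.i.d. copies, via repeated convolution."""
    out = pmf.copy()
    for _ in range(n - 1):
        out = np.convolve(out, pmf)
    return out  # entry k corresponds to the value n + k

for n in (2, 4, 32):
    p = pmf_of_sum(pmf, n)
    values = np.arange(n, 8 * n + 1)            # support of the sum: n, ..., 8n
    mu, sigma = 4.5 * n, np.sqrt(5.25 * n)      # E[X] = 4.5, var(X) = 5.25
    normal = np.exp(-((values - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    print(f"n = {n:2d}: max |PMF - normal density| = {np.abs(p - normal).max():.5f}")

# The sampling trick: add 32 independent uniform draws and standardize.
rng = np.random.default_rng(0)
sums = rng.integers(1, 9, size=(10_000, 32)).sum(axis=1)
z = (sums - 32 * 4.5) / np.sqrt(32 * 5.25)      # approximately standard normal
print("mean and std of z:", round(z.mean(), 3), round(z.std(), 3))
```

Because the lattice spacing is 1, each PMF value can be compared directly with the normal density at the same point, and the printed gap shrinks as n grows, which is the convergence that the plots illustrate.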
Now in this example, things worked out well for us because the distribution that we started from was nicely symmetric and didn't have any strange features. Things are not always so favorable.

Let us consider starting instead from a truncated geometric PMF. If we add eight independent random variables drawn from this distribution, what we obtain is a PMF of this form, which does not really look like a normal shape. The reason is that there's a pronounced asymmetry. So let us add more and more independent X's. If we add 16 of them, we start to get something that's a little closer to normal, but the asymmetry is still visible. And if we add 32 of them, we can still see some asymmetry: the tail on one side does not look exactly like the tail on the other side.

So in this instance, it's really the asymmetry of the original distribution that makes it difficult to converge, and it will take larger values of n before we get a very accurate approximation.
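The skewed case can be explored with the same kind of computation. Here is a companion sketch, again not from the lecture, that uses a geometric distribution truncated to {1, ..., 8}; the lecture does not state the parameter or the truncation point, so p = 0.5 and this support are illustrative assumptions.

```python
# A minimal sketch of the skewed experiment, assuming a geometric(p = 0.5)
# distribution truncated to the values 1, ..., 8 (both choices are
# illustrative; the lecture does not give the exact PMF).
import numpy as np

p = 0.5
pmf = p * (1 - p) ** np.arange(8)   # geometric weights p * (1 - p)**(k - 1), k = 1..8
pmf /= pmf.sum()                    # renormalize after truncation

def pmf_of_sum(pmf, n):
    """Exact PMF of the sum of n i.i.d. copies, via repeated convolution."""
    out = pmf.copy()
    for _ in range(n - 1):
        out = np.convolve(out, pmf)
    return out

support = np.arange(1, 9)
mean = support @ pmf
var = (support ** 2) @ pmf - mean ** 2

for n in (8, 16, 32):
    q = pmf_of_sum(pmf, n)
    values = np.arange(n, 8 * n + 1)            # support of the sum
    mu, sigma = n * mean, np.sqrt(n * var)
    normal = np.exp(-((values - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    print(f"n = {n:2d}: max |PMF - normal density| = {np.abs(q - normal).max():.5f}")
```

Comparing these printed gaps with those from the symmetric uniform case at the same n makes the point of the lecture concrete: the skewed starting PMF needs a noticeably larger n before the PMF of the sum matches the normal density to the same accuracy.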