Independence is one of the central concepts of probability theory, because it allows us to build large models from simpler ones. How should we define independence in the continuous case? Our guide comes from the discrete definition. By analogy with the discrete case, we will say that two jointly continuous random variables are independent if the joint PDF is equal to the product of the marginal PDFs: f_{X,Y}(x, y) = f_X(x) f_Y(y), for all x and y.

We can now compare with the multiplication rule, f_{X,Y}(x, y) = f_Y(y) f_{X|Y}(x | y), which is always true as long as the density of Y is positive. So the multiplication rule is always true, and in the case of independence the factorization above is also true. Therefore, in the case of independence, we must have that the conditional PDF f_{X|Y}(x | y) is equal to the marginal PDF f_X(x), at least whenever the marginal of Y is positive. So to restate it, independence is equivalent to having the conditional PDF of X, given Y, be the same as the unconditional PDF of X. This has to be true whenever Y has a positive density, so that the conditional is well defined, and it also has to be true for all x.

Now, what does this really mean? The conditional PDF, as we have discussed, is, in terms of pictures, a slice of the joint PDF. Therefore, independence is the same as requiring that all of the slices of the joint have the same shape, and it is the shape of the marginal PDF. For a more intuitive interpretation: no matter what value of Y you observe, the distribution of X does not change. In this sense, Y does not convey any information about X. Notice also that this definition is symmetric as far as X and Y are concerned. So by symmetry, when we have independence, it also means that X does not convey any information about Y, and that the conditional density of Y, given X, has to be the same as the unconditional density of Y.

We can also define independence of multiple random variables. The definition is the obvious one: the joint PDF of all the random variables involved must be equal to the product of the marginal PDFs.
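Returning to the two-variable case, here is a minimal numerical sketch of the factorization criterion f_{X,Y}(x, y) = f_X(x) f_Y(y). The bivariate normal family and the specific correlation values are my own illustrative choices, not part of the lecture: with correlation 0 the joint PDF factors into the two standard normal marginals, while with correlation 0.7 it does not.

```python
# Sketch: check the factorization f_{X,Y}(x, y) = f_X(x) * f_Y(y) on a grid.
# Example distributions (bivariate normal, rho in {0, 0.7}) are illustrative choices.
import numpy as np
from scipy.stats import multivariate_normal, norm

x = np.linspace(-3, 3, 61)
y = np.linspace(-3, 3, 61)
X, Y = np.meshgrid(x, y)
grid = np.dstack([X, Y])  # shape (61, 61, 2); last axis holds (x, y)

for rho in (0.0, 0.7):
    joint = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])
    f_xy = joint.pdf(grid)                     # joint PDF on the grid
    f_x_times_f_y = norm.pdf(X) * norm.pdf(Y)  # product of the (standard normal) marginals
    gap = np.max(np.abs(f_xy - f_x_times_f_y))
    print(f"rho = {rho}: max |f_XY - f_X * f_Y| = {gap:.2e}")
```

Running this, the gap is essentially zero for rho = 0 (independent) and clearly positive for rho = 0.7 (dependent), matching the definition above.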
Intuitively, independence of multiple random variables means that knowing the values of some of the random variables does not affect our beliefs about the remaining random variables.

Finally, let us note some consequences of independence, which are identical to the corresponding properties that we had in the discrete case, and the proofs are also exactly the same. The expectation of the product of independent random variables is the product of the expectations: E[XY] = E[X] E[Y]. The variance of the sum of independent random variables is the sum of the variances: var(X + Y) = var(X) + var(Y). And functions of independent random variables are also independent, which, in particular, implies, using the previous rule, that the expected value of a product of the form g(X) h(Y) is going to be the product of the expectations: E[g(X) h(Y)] = E[g(X)] E[h(Y)].

So independence of continuous random variables is pretty much the same as independence of discrete random variables as far as the mathematics is concerned, and the intuitive content of the independence assumption is the same as in the discrete case: one random variable does not provide any information about the other.
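As a rough Monte Carlo sanity check of these three consequences, here is a small Python sketch. The particular distributions for X and Y, and the functions g and h, are my own illustrative choices; the point is only that, for independent samples, the estimated quantities on each line come out (approximately) equal.

```python
# Sketch: Monte Carlo check of E[XY] = E[X]E[Y], var(X+Y) = var(X)+var(Y),
# and E[g(X)h(Y)] = E[g(X)]E[h(Y)] for independently drawn X and Y.
# Distributions and functions below are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.exponential(scale=2.0, size=n)   # E[X] = 2,  var(X) = 4
Y = rng.uniform(-1.0, 1.0, size=n)       # E[Y] = 0,  var(Y) = 1/3

# Expectation of a product vs product of expectations
print(np.mean(X * Y), np.mean(X) * np.mean(Y))

# Variance of a sum vs sum of variances
print(np.var(X + Y), np.var(X) + np.var(Y))

# Functions of independent random variables, here g(x) = x**2 and h(y) = cos(y)
g, h = X**2, np.cos(Y)
print(np.mean(g * h), np.mean(g) * np.mean(h))
```

Each printed pair agrees up to Monte Carlo error, which is exactly what the discrete-case proofs, carried over verbatim, guarantee.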