Lecture 6


Instructor: Vina Nguyen

Lecture Topics:
Random Variable, Discrete vs. Continuous, Probability Mass Functions, Bernoulli Random Variable, Binomial Random Variable, Geometric Random Variable, Poisson Random Variable, Expected Value, Variance, and Standard Deviation


The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation or to view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

VINA NGUYEN: So can anyone tell me what a random variable is, or do you want me to define it?

AUDIENCE: Isn't it just any value within your sample space?

VINA NGUYEN: Yeah. So it's a way to represent the value you want, which can be anything. I'll write it out. So remember how we talked about sample space? Oh, wait. What's your name?

AUDIENCE: Eric.

VINA NGUYEN: Eric. OK. Let me write you in. Eric. So imagine this is your sample space. So this is your universal sample space. And what a random variable does is take any of your outcomes in the sample space and puts it on a real number line. So this could be 0, 1, 2, 3. And say you have some random value here, you're going to map it to whatever real number value line that is.

So the example that I have in here is if you say that your random variable x is the maximum of two rolls, you have a sample space. So this is like your first roll. This is your second roll.

So this is your sample space. And then you have the real number line, where x is some certain event. But x is the maximum of the two rolls. So it could be 1, 2, 3, 4, 5, 6. So if we roll a 2 and a 3, then that would be mapped to 3. Pretty straightforward. If you have 5 and 5, this sample space maps to 5, et cetera. Does everyone understand that?

AUDIENCE: Oh, maximum roll size and pick one, and you've got to combine them?

VINA NGUYEN: Hm?

AUDIENCE: Oh, so when you meant max of two rolls, you meant the highest of the two rolls, not the sum of the rolls.

VINA NGUYEN: Yeah.

AUDIENCE: Oh.

AUDIENCE: Is it also, like, [INAUDIBLE]?

VINA NGUYEN: Oh, OK. Sorry. So this is not really a coordinate system. It's just the way that we represent it. So I'm not mapping x against y. So x is separate from this sample space. So capital X can be any one of these numbers, but lowercase x means a certain one of these numbers. So that's the difference between capital X and lowercase x, which is one of the questions you guys had.

So does that make sense? OK. If you notice here, this is discrete because you can't have 1.1, 1.2, 1.3, 1.333, et cetera. So this is discrete. It's countable, and in this case it's also finite. To be discrete, the values just have to be countable.

If you look on the back of the first page, I already told you that the first three are discrete. So I'm just going to ask why the fourth and fifth random variables aren't discrete, if anyone can tell me that. Anybody? Yep?

AUDIENCE: Well, the range is infinite for time.

VINA NGUYEN: Right. Yeah. Did everyone hear that? So the range is infinite, and you can't discretely count them. Yep. Exactly. But the nice thing about random variables is that we can take a continuous thing like that and make it into discrete.

So the example I have is, let's say a is some random variable that has a range from negative infinity to infinity. Does everyone understand this notation? OK. So we can't have a discrete random variable that describes this because it is continuous and infinite.

So we're going to make a function. I just call it f because sgn might confuse you guys. So we're going to convert a into discrete. So we're going to make it 1 if a is greater than 0, 0 if a is equal to 0, and negative 1 if a is less than 0. So by taking a continuous sample space we've made it discrete. Does everyone see that?

So you have this whole infinite range. And we've mapped it into the discrete values negative 1, 0, and 1. So does everyone see how we can use this random variable thing to make continuous things discrete? OK.
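As a sketch in code, the discretizing function described here (called f, following the lecture's naming) might look like this, with the continuous inputs drawn from a normal distribution just for illustration:

```python
import random

def f(a):
    """Map a continuous value a onto the discrete set {-1, 0, 1}."""
    if a > 0:
        return 1
    elif a == 0:
        return 0
    else:
        return -1

# a, drawn from a continuous range, always lands on one of three values.
samples = [f(random.gauss(0, 1)) for _ in range(10)]
assert all(s in (-1, 0, 1) for s in samples)
```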

So the thing about random variables is that they need probability mass functions, and that's just a very technical way of saying, what's the probability of x being any of these? So the way we write that is p of x. So what this is saying is the probability of this random variable x being a specific x. So the probability of x max of two rolls being specifically 1, 2, 3, 4, 5, 6.

So that would be like this. That's basically what it's saying. So the example I have for this is if we say a random variable x-- this is just random variable. x equals the number of heads obtained.

So if we have our random variable defined as x equals the number of heads obtained in a two toss sequence, the first thing we need to do is write down what our PMF is. What is our probability mass function? So what is this, essentially? Does anyone have any idea how to start?

AUDIENCE: You could draw the table of all the possible outcomes. So you could have head-head, head-tail, tail-head, tail-tail.

VINA NGUYEN: Mhm. Is that it? Oh, wait. You said--

AUDIENCE: Oh, and tail-tail.

VINA NGUYEN: Yeah. So basically, x equals 0, x equals 1, x equals 2. Does everyone see that? So this means 0 heads, which would be this. 1 would be these, and then 2 would be that. So we're going to ask, what's the probability that x is 0?

AUDIENCE: 1/4.

VINA NGUYEN: Louder.

AUDIENCE: 1/4.

VINA NGUYEN: Yep. And 1.

AUDIENCE: 1/2.

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: And in probability mass functions we also want to add the "0 otherwise." So my TA got me for that a lot, so make sure you remember that in college. And the example here, though, I've combined these two, which is how you're technically supposed to do it. So we would actually take this out and then put that. So it's just a simpler way of writing it. See that?

Am I going too fast? Does everyone understand? All right. So that's just an example of how to calculate PMFs. And if you notice, if you add these, you get 1. So it's easy to tell if you have this written out. 1/4 plus 1/2 plus 1/4 is 1. So make sure, if you do write it in this way, that you add it twice, because you have two separate x's.
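The two-toss PMF worked out here can be checked by enumerating the sample space directly; a small sketch:

```python
from itertools import product
from fractions import Fraction
from collections import Counter

# Enumerate the four equally likely two-toss outcomes and count heads.
pmf = Counter()
for outcome in product("HT", repeat=2):
    x = outcome.count("H")       # value of the random variable X
    pmf[x] += Fraction(1, 4)     # each outcome has probability 1/4

# P(X=0) = 1/4, P(X=1) = 1/2, P(X=2) = 1/4, and the PMF sums to 1.
assert sum(pmf.values()) == 1
```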

So now we're going to talk about specific kinds of random variables. The first one is the easiest one, and that is the Bernoulli. Yeah. Bernoulli random variable.

So essentially, this is your coin toss. So you have x equals heads. Or, in more general terms, that would be success. So this is your random variable, and that's how you're defining it.

So your PMF is pretty simple. p equals the probability that you get heads, and then 1 minus p, probability that you don't. And we're going to use the real number line. So you have your sample space again. You have heads here. Tails. Even though it's not continuous, this doesn't actually have a real number line value, so we're going to make heads 1 and tails 0.

So this is x equals 1 and 0. Does everyone see that's pretty simple? There's applications for this, like whether a telephone is free or busy for someone who is a telemarketer and wants to know the probability that the person will pick up, or if a person is sick or healthy, simple things like that.
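A minimal sketch of a Bernoulli trial, using the "heads maps to 1, tails maps to 0" convention from the board:

```python
import random

def bernoulli(p):
    """One trial: 1 (heads/success) with probability p, else 0 (tails)."""
    return 1 if random.random() < p else 0

# E.g. a telemarketer's pick-up probability, or sick versus healthy.
toss = bernoulli(0.5)
assert toss in (0, 1)
```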

So our second one is the binomial random variable, which is basically a sequence of Bernoulli random variables. So the way this is set up is that you toss a coin N times, and then your random variable x is the number of times the heads comes up.

So a number of heads in an N toss sequence. The more general way of saying that is the number of successes in N number of trials. OK. And your P, again, is the probability of success in just one try.

And that's where your Bernoulli comes in. So that's the definition. Does everyone understand how that's working? You flip once. What is it? You flip again, tails. Flip, heads. So in here, your x would be 2. That's one of the examples.

So we're going to figure out what the PMF is. Have you guys seen this before? No? OK. We're going to use a lot of concepts to figure out what this is based on what you guys already know. You already know what coin tossing is.

You guys know what multiplication rule is too, right? You have a sequence of things. You just multiply the probabilities. You know what combinations are. Does order matter in this or not?

AUDIENCE: No.

VINA NGUYEN: No. OK. Good. And then the fourth thing you just learned is random variables. Coin toss thing I will write up again. It's basically Bernoulli random variable. P, probability that you get heads, and 1 minus P is the probability that you don't.

So for the multiplication rule, if we say we toss it N times and you get K number of heads. And for my example, we'll use N equals 5, K equals 2. So what's the probability that you get K heads? Actually, how would you calculate the probability that you get K heads N times?

So you get-- so the probability of 1 is p. Just one trial. And then multiplication rule, you multiply this. That's what the star is. You do that again, but this time it's 1 minus p. 1 minus p. 1 minus p. And the only reason you can do this is because of this. So you know that.

And another way to write this is p to the K, because K is the number of times you got heads. And then 1 minus p to the N minus K. Does everyone see that? Because N is the total, and K is the number of heads. So you just want to take the difference.

So that's part 2. For combinations, you know that's not the only way you can get two heads and three tails. You can have that, you can have this, et cetera, et cetera. So your combinations comes in. And how many of these sequences are possible?

And we learned that's this many. So that would be 5, 2. Does everyone see that? And of course, you times it by this probability. So this is the number of times you can have this combination, where K is the number of heads and N is the number of tosses.

Does that make sense? So you combine them all together, and you get probability of x equals K. We're going to use K in this example so that it differentiates it from N. Equals the number of combinations times the probability, which we got there.

Does everyone see that? Sorry. So what's our restriction on K? It can't be this, right? That doesn't work. So what does K have to be?

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Yeah. And also 0. So K goes from 0 up to N. And this is discrete again. Can you see? OK. So those are two of the major random variables that people usually start teaching with. So does everyone understand that? I'm going to erase this. Where's the eraser? Do you guys need this?
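The binomial PMF just derived, N choose K times p to the K times 1 minus p to the N minus K, can be written out directly; a sketch using the board's example of N equals 5 and K equals 2:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k): C(n, k) ways to place the heads, times p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The board example: n = 5 tosses, k = 2 heads, with a fair coin.
assert abs(binomial_pmf(2, 5, 0.5) - 10 / 32) < 1e-12

# K runs from 0 to n, and the probabilities sum to 1.
assert abs(sum(binomial_pmf(k, 5, 0.5) for k in range(6)) - 1) < 1e-12
```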

So if you just have equations like this it's kind of hard to get a feel for what your random variable is saying. So we're going to graph it. And this is called the distribution. This is your coin toss. This is the x's that it can take, so 0, 1, and this is your PMF which we calculated.

So the probability that you get 0 is 1 minus p let's say. And this could be p or something. And this can be reversed depending on what p is. So for your binomial-- so for your binomial, if I say that N equals 9 and p equals 1/2, then it's going to be symmetrical. So if you have 9, 0, it's going to look like whatever number that is. 4 and 5.

So symmetrical of p equals 1/2. But if p is less than 1/2, it's going to be skewed this way. And if p is greater than 1/2, it's going to be skewed that way. Does everyone see that it's really easy to calculate if you do have numbers. You just plug in x, plot the probability, and then you get this distribution.
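The symmetry and skew claims are easy to verify numerically; this sketch re-defines the binomial PMF so it stands alone:

```python
from math import comb

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# n = 9, p = 1/2: the distribution is symmetric, P(X=k) = P(X=9-k).
assert all(abs(binomial_pmf(k, 9, 0.5) - binomial_pmf(9 - k, 9, 0.5)) < 1e-12
           for k in range(10))

# With p < 1/2 the mass shifts toward small k; with p > 1/2, toward large k.
assert binomial_pmf(1, 9, 0.2) > binomial_pmf(8, 9, 0.2)
```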

So these are all called distributions. Does that make sense? So far we know the Bernoulli random variable and the binomial. So I'm going to tell you about the geometric random variable.

And the way the random variable is described for that is x equals the number of tosses needed for a head to come up.

AUDIENCE: Do you need this?

VINA NGUYEN: No, you can-- you can just put it there. So that x includes the toss where you get a head. So number of tosses needed for a head to come up the first time.

AUDIENCE: Why the first time?

VINA NGUYEN: It's just like if you were simulating how many times you need to do something before you get it the first shot.

AUDIENCE: Oh, OK.

VINA NGUYEN: Yeah. So if we say that K is the number of times, then how would we write that probability? So this would be K equals 5. So it's kind of like that. You have p, p times 1 minus p, et cetera. Probability that you don't get heads for how many times?

AUDIENCE: One.

VINA NGUYEN: Right. Which is K minus 1. And the probability that you get it once? It is just 1. Does everyone see that? So that is how you write your PMF. So the PMF for this random variable equals 1 minus p to the K minus 1, times p. And in this case, what can K equal? It can't be 0 this time. So it starts from 1.

AUDIENCE: Goes to K, right?

VINA NGUYEN: No, because you're defining K, right? OK. So it's countable, which is OK, which makes it still discrete, even though it does go to infinity. So that would be discrete.

And if we graph it to give you more visual understanding of what this looks like, if it only takes one time, what's the probability? p. Right. And if you graph it, it will slowly go down, like this, depending on how you choose p and what K is, et cetera. So that's just an example of what it looks like.
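The geometric PMF just written, 1 minus p to the K minus 1 times p, can be sketched and checked like this:

```python
def geometric_pmf(k, p):
    """P(first head comes up on toss k): k - 1 tails, then one head."""
    return (1 - p)**(k - 1) * p

# If it only takes one toss, the probability is just p.
assert geometric_pmf(1, 0.6) == 0.6

# The bars decrease geometrically, and over k = 1, 2, 3, ... they sum to 1.
total = sum(geometric_pmf(k, 0.6) for k in range(1, 200))
assert abs(total - 1) < 1e-12
```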

So applications for this could be the number of times you need to take a test before you pass if your probability is, like, 0.6, which is not good. Another example could be finding a missing item in a given search. So that could be like an airplane, where they're trying to find your luggage, where p is like 0.0001.

But those are some real-life examples, because no one really cares about heads. OK. Does anyone need this? Anyone need this? OK. The fourth random variable is a little bit more complicated. Does everyone know what e is? The natural constant. Two point seven whatever.

So now you know a third kind, geometric. And the fourth one I'm going to tell you about today is the Poisson random variable. So has anyone heard of this before?

AUDIENCE: I've heard of it.

VINA NGUYEN: So I'm going to tell you the PMF right off the bat instead of deriving it. So the Poisson random variable is mainly used to approximate binomials. And you know what binomial RVs are anyway. So K is the number of heads in an N toss sequence.

So this works only if this lambda here is equal to Np, where N is the number of tosses and p is the probability of success. And N has to be really large, and p has to be very small. So instead of a coin toss, where you could have N equals 10 and p equals 1/2, p has to be like-- say it's 0.01, which is really small, and this could be like 1,000. They're relative to each other.

So an example of this could be if you're fixing the number of typos in a book and your probability is really small, but your N is large because N could be the number of words, which is a lot. Or another example would be the number of cars that get into accidents everyday, where N is like--

AUDIENCE: [INAUDIBLE].

VINA NGUYEN: Yeah. Where N is a lot and p is, hopefully, pretty small.
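The PMF on the board is the standard Poisson formula, lambda to the K times e to the minus lambda over K factorial, with lambda equal to N times p. A sketch, using hypothetical typo numbers in the spirit of the book example:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) with lam = N * p; approximates the binomial when
    N is very large and p is very small."""
    return lam**k * exp(-lam) / factorial(k)

# Hypothetical typo example: N = 100,000 words, p = 0.00001 per word.
lam = 100_000 * 0.00001        # lam = 1
assert abs(poisson_pmf(0, lam) - exp(-1)) < 1e-12
```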

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: So we're going to do a problem to help you understand. Anyone need this?

AUDIENCE: What page are we?

VINA NGUYEN: You're page-- second. Second. Does everyone see the problem? Because I really don't feel like writing it. Did everyone read it yet?

AUDIENCE: Almost.

VINA NGUYEN: Almost. It's called problem, and it's on the second page.

[LAUGHTER]

OK. Just checking. It's early for me.

AUDIENCE: Exactly one birthday, how specific? Just, like, the day? So it has to be year.

VINA NGUYEN: Yep.

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: So there's only 365 days. Not 366. Just 365.

AUDIENCE: 365.

VINA NGUYEN: Round down. OK. Is everyone good? OK. So I'll summarize it. You have a party with 500 guests. What is the probability that only one guest has the same birthday as you?

So we're going to solve it using the binomial way and then the Poisson approximation. So what's your N? What's the total number that we're going to add here?

AUDIENCE: 500.

VINA NGUYEN: Oh, sorry. 500 includes yourself. So--

AUDIENCE: 499.

VINA NGUYEN: Yeah. So 499. What's K? K is your number of successes.

AUDIENCE: 1.

VINA NGUYEN: Yep. And p, the probability?

AUDIENCE: 499.

VINA NGUYEN: I can't hear. I can't hear. I can't hear you guys. 1 out of--

AUDIENCE: 499.

AUDIENCE: 3.

VINA NGUYEN: No, 365.

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Yeah.

AUDIENCE: Because you only have one.

VINA NGUYEN: What were you saying before, Priya? Do you understand why? OK. So this is number of days, and you only want one. So here's your problem set up. If we do it the binomial way, how do we write that? You have N, K, et cetera.

AUDIENCE: p, K.

VINA NGUYEN: We'll just plug it in.

AUDIENCE: Oh. 499. And then 365.

VINA NGUYEN: Mhm.

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: 498. Right. N minus K. So what we're doing is just filling in this part: N choose K, p to the K, 1 minus p to the N minus K. So this is kind of like binomial and geometric, since we are just doing K equals 1.

So how do we solve that? What does this become?

AUDIENCE: 499.

[INTERPOSING VOICES]

VINA NGUYEN: What?

AUDIENCE: It's sort of a little hard.

VINA NGUYEN: Yeah. It's really hard. So you have 498.

AUDIENCE: Yeah. So it's just 499.

VINA NGUYEN: Yeah. But if you had a fair number. Yeah. This part is hard.

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Anyone tell me what this is? You're not understanding? OK. This is 0.3486. So this is what you get when you do it the exact way.

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Wait. Sorry. Sorry. Yeah. Wait, is it?

AUDIENCE: Yep. Oh, yeah, that's right.

VINA NGUYEN: That's right, isn't it? To calculate it.

AUDIENCE: It's high.

VINA NGUYEN: Yeah. Kind of figured.

AUDIENCE: Well, we have so many people.

VINA NGUYEN: So if we do it the Poisson way, what is this? This is our parameter.

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: N is--

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Before I go on, does everyone understand this?

AUDIENCE: Yeah.

VINA NGUYEN: OK. So you have-- I wrote this kind of funny. All right. So this is our PMF. So if you plug everything in, what do you get?

AUDIENCE: [INAUDIBLE]

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Hm?

AUDIENCE: What's K again?

VINA NGUYEN: K is 1. Is that right?

AUDIENCE: Yeah.

VINA NGUYEN: So this has a lot less exponents and stuff, so it's a lot easier to calculate. So what do we get?

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: This guy.

AUDIENCE: Oh, 0.348.

VINA NGUYEN: Yep. So you get a pretty good approximation. So by using the Poisson, you can get pretty good approximation without doing all this extra calculation. So that's one of the reasons they made this random variable. Well, they didn't make it. They derived it.
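The two calculations compared here, the exact binomial against the Poisson approximation with N = 499, K = 1, and p = 1/365, can be reproduced in a few lines:

```python
from math import comb, exp, factorial

n, k, p = 499, 1, 1 / 365      # 499 other guests, one match, one day in 365

# Exact binomial way: C(499, 1) * (1/365)^1 * (364/365)^498
exact = comb(n, k) * p**k * (1 - p)**(n - k)

# Poisson approximation with lambda = N * p
lam = n * p
approx = lam**k * exp(-lam) / factorial(k)

# Both come out near 0.348-0.349, matching the numbers in the lecture.
assert abs(exact - approx) < 0.001
```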

AUDIENCE: So it's like an estimation variable.

VINA NGUYEN: Yeah. For this specific application. Does that make sense to everybody? Keep in mind, this only works, again, if N is large and p is small. So you can't do it if it's just a coin toss. OK.

AUDIENCE: What happens [INAUDIBLE]

VINA NGUYEN: Just calculator doesn't work. So what's this? What's the Bernoulli one?

AUDIENCE: Bernoulli is--

VINA NGUYEN: What is x? Our random variable. What is it? What are we defining it as?

AUDIENCE: Success.

VINA NGUYEN: In just one trial.

AUDIENCE: Several.

VINA NGUYEN: No, just one.

AUDIENCE: Oh, right.

VINA NGUYEN: Just one. This is N number of trials. What's this?

AUDIENCE: How many times it takes.

AUDIENCE: How long it takes. How many trials it takes in order to get success.

VINA NGUYEN: And this is including the trial that you do get success. And Poisson is what you just learned.

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: For that. OK. Good. So those are the four main random variables. And I'm going to tell you what expectation and variance is. Do you guys need any of this? So even though it's a funny name like random variable, it's basically just summarizing what you guys already know-- probabilities, sample space, et cetera. It's just a very short way of writing all that.

Anyone need this? This? Still need it? OK. How many of you guys have taken statistics or any kind of statistics?

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Kind of? Do you guys know what mean is? OK. Do you know what variance is?

AUDIENCE: No.

VINA NGUYEN: No? OK. So I'll skim over mean then. So in probability, mean is actually called expectation, which is your expected value. And the reason for that is because a mean kind of implies a bunch of experiments and you find the mean. But in probability, you might only do one trial. So this is your expected value, even if it is the same thing, mathematically, as your average.

And then variance is just another way of describing how spread out your data is from that mean.

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Yeah.

AUDIENCE: Nice.

[SIDE CONVERSATION]

VINA NGUYEN: And like you've mentioned, you probably already know what standard deviation is. That's equal to the square root of variance.

AUDIENCE: Oh.

VINA NGUYEN: Yeah. So this is kind of easy because you guys sound like you already know this.

AUDIENCE: Not really.

VINA NGUYEN: No? I will keep on going.

AUDIENCE: Standard. Is that STD?

VINA NGUYEN: They're not all caps.

[LAUGHTER]

Is that better? All right.

[SIDE CONVERSATION]

So I'll run through this example to show you what expectation is. So you have two independent coin tosses, and the probability that you get heads is 3/4, and your random variable is x equals the number of heads obtained. So what kind of random variable is that?

AUDIENCE: Bernoulli?

AUDIENCE: Binomial.

AUDIENCE: Bernoulli.

AUDIENCE: Binomial.

AUDIENCE: Binomial.

VINA NGUYEN: Yes. Why? Because there's two tosses, right? It's not just one.

AUDIENCE: Did you just make up 3/4?

VINA NGUYEN: Yeah. Just make it up.

AUDIENCE: Just wanted to be sure.

VINA NGUYEN: Yeah. It could be biased. It was more interesting than 1/2. OK. So like I said, to describe a random variable you need probabilities. Otherwise, this doesn't matter. So what is your PMF?

AUDIENCE: K could be 0.

VINA NGUYEN: And like I said, you have to add the "0 otherwise." So what is 0? The probability that your x equals 0? Anybody?

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Yeah. So we're just going to write that like this. 1? Probability that you just get 1?

AUDIENCE: 3/4.

AUDIENCE: 3/4.

VINA NGUYEN: But there's two different ways, right? So this is your tail-tail. This is your head-tail. And this one?

AUDIENCE: 3/4.

VINA NGUYEN: So the reason is 3/4, like you said. So we have two different probabilities for that. Otherwise, the 1/2 thing would throw us off. So expectation, like I've said, is your average. And the way we write it in probability is e of your random variable x. So x is your random variable. This means mean.

So how would you figure out the mean of that? If we're just going to flip it twice, what is the average that we're expecting?

AUDIENCE: [INAUDIBLE]

AUDIENCE: Could you repeat that?

VINA NGUYEN: Hm?

AUDIENCE: Could you just repeat that again?

VINA NGUYEN: Oh, yeah. So if you have your random variable and then probabilities of each event, how are you going to figure out the average?

AUDIENCE: Take them all out and then divide by the number of them, right?

VINA NGUYEN: Kind of. Not really. So let's say you have K equals 0 times the probability. So let's say that your x equals 0 times by this probability, which is plus 1 times that probability. So how do I finish this off? 2, right? That's your last.

AUDIENCE: 3/4 cubed times--

VINA NGUYEN: Does everyone see this? So the general way of saying that is this is sum-- I know someone asked this in one of the sheets-- sum of all your possible x's times the probability of that x, which is essentially what we just did. x equals 0. Probability of x equals 0. x equals 1, probability of x equals 1. x equals 2, probability of x equals 2. Everyone understand that formula? OK.
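The expectation formula described here, the sum over all possible x's of x times its probability, can be checked with the two-toss, p equals 3/4 example:

```python
from fractions import Fraction

p = Fraction(3, 4)             # probability of heads

# PMF of X = number of heads in two independent tosses.
pmf = {0: (1 - p)**2, 1: 2 * p * (1 - p), 2: p**2}

# E[X] = sum over all x of x * P(X = x)
expectation = sum(x * prob for x, prob in pmf.items())
assert expectation == Fraction(3, 2)    # 1.5 heads expected on average
```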

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Hm?

AUDIENCE: Oh, OK. Got it. Yeah. Never mind. What comes out isn't a probability.

VINA NGUYEN: Yeah.

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Right. So you have your real number line, 1, 2, and what you're expecting is that it falls right here. So this would be your [INAUDIBLE]. This is probably something like here. I don't know. You can graph it out later. Basically, your expected value is what the x is, not the probabilities. Does that make sense? Good question.

Does everyone understand expectation? OK. So variance is basically your expectation of this. And what this is basically saying is the difference between your actual number and the mean squared. So a graphical way of looking at this--

AUDIENCE: Why do you square it?

VINA NGUYEN: I'll get to that. This is not that graph. This is just like a random xy thing. So let's say you have a bunch of plots, whatever. So these are like your data points. And you figured out that this is the mean. So this is your expectation. This is what you expect to get. And x is each one of these.

So this is like a real value of x. And x minus your mean would be this distance. So x minus the mean. x minus the mean. And the reason we square this is that for here, this might be a positive number. This might be a negative number. So we want to get rid of all this confusion and just square it.

AUDIENCE: Doesn't that change the value? Couldn't you just do absolute values?

VINA NGUYEN: You could. But to get the standard dev is a lot easier if you can see mathematically where the square root's coming from. So standard dev would be just to get rid of that square, and then you can get the actual.

AUDIENCE: Right.

VINA NGUYEN: Absolutely. So that's graphically what variance is. So if your data was really way off, then you get huge distances. And then you square it and you get a bigger number. So that shows that your data is more varied. And if your data is really tight, then your distance is small. Yeah. And you square that. OK. Does that make sense?

So I haven't actually done this, but if you want you can-- if you want to calculate the variance of that you would just write out all of these using that PMF, because you have the mean. And just square everything. So because this expected value of this, your final formula is expected value of-- OK.

So that's your final formula for that-- what you expect to get when you calculate all of these. Now remember that since you have-- this is a random variable, but you need to do it for x equals, in this case, 0, 1, 2, et cetera, et cetera.
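Carrying out the variance calculation described here for the same two-toss, p equals 3/4 example, with the mean of 3/2 mentioned in class:

```python
from fractions import Fraction

p = Fraction(3, 4)
pmf = {0: (1 - p)**2, 1: 2 * p * (1 - p), 2: p**2}
mean = sum(x * prob for x, prob in pmf.items())            # 3/2

# Var(X) = E[(X - mean)^2]: weight each squared distance by its probability.
variance = sum((x - mean)**2 * prob for x, prob in pmf.items())
assert variance == Fraction(3, 8)

# Standard deviation is the square root of the variance.
std_dev = float(variance) ** 0.5
```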

So I had problems-- oh, wait. Questions about any of this before I do the problems?

AUDIENCE: That's a distinct, right?

VINA NGUYEN: Hm?

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: Oh, OK. This is just your mean. This is the expected value of all of these distances squared. So it's not graphed on here. It would be like a number that you would find.

AUDIENCE: So you just add them together? [INAUDIBLE]

VINA NGUYEN: Yeah. So like you would for 0 or something, you would do-- and your mean is 3/2. Square it, and then you have a probability that you would get. And we have those probabilities. Probability of whatever that was.

And then you have to do the mean of this. So whatever you get for these, then you would times it by the probabilities and then get your expectation. So you're just converting your x's into this new random variable almost.

So you can kind of think of this as like y or something. Or R, or K, whatever number you want. So then you would just use that same formula. OK? Any other questions? Did the solve last week go over the problems? Do you have any questions? You want me to go over any specific one?

AUDIENCE: I didn't get the problems.

VINA NGUYEN: Oh, you didn't get them? Does anyone have a copy she could borrow or share? Because I only have one. Nobody? It's OK. I'll just borrow it if I need it.

AUDIENCE: I can copy one.

VINA NGUYEN: Hm?

AUDIENCE: I can copy.

VINA NGUYEN: It's fine. Only if people have questions. Does anyone have questions about any of them? Do you want me to go over all of them?

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: What was the fourth one?

AUDIENCE: Something to do with a deck of cards.

VINA NGUYEN: Sorry. What was the fourth one? OK. Do you want me to go over that one?

AUDIENCE: Yeah.

VINA NGUYEN: OK. Can someone read it out because I don't have it.

AUDIENCE: Oh, basically, you had a [INAUDIBLE] cards into four hands. What's the probability of one ace in each hand?

VINA NGUYEN: OK. Does anyone need this? No. Good. I can say it again. This? So this was the last problem you did? So 52 cards. How many hands? How many hands was it? Four hands. And one ace per hand. So what was the main question, just how to do it, or--

AUDIENCE: No. Just what's the probability of one ace in each hand.

VINA NGUYEN: Oh, OK. So there are two ways, like we mentioned before. The easier way, in my opinion, is the sequential way. So let me read. OK. All right. So the first hand, we have 13 slots. 13 slots. Second hand has that many.

So if you were to put one ace here, that's just 52 different possibilities over the total number. So we can go to either one. Any one. It doesn't really matter.

AUDIENCE: Oh, right.

VINA NGUYEN: If you put one here, how do we calculate that? How many spots are left? 51, right?

AUDIENCE: 51. But you can only go to 39.

VINA NGUYEN: Yeah. Does everyone understand that? So you take out all of these as your possible choices, but they are still possible choices, but just not where the ace can go.

AUDIENCE: Then 26 out of 50.

VINA NGUYEN: Mhm. Does everyone understand that? First ace, second ace, third ace, fourth ace. So that's the intuitive way, in my opinion. The second way we can use is counting. We have partitions, because there's four different partitions. Stuff like that.

So you have your top part and your bottom part. So your bottom part is the total number of combinations. And this is only the ones that match your criteria. So we'll do this one first. Since we know partitions, we have the bottom part, which is the denominator.

Do you guys remember partitions? Kind of? Kind of? OK. So there should be two spots. And we have four hands, and we have 13 slots in each of those hands. So you partition that like this.

AUDIENCE: Oh, [INAUDIBLE]

VINA NGUYEN: OK. So this is the number of combinations you can have. 13 cards in four different ways. 52. First thing you want to count is how many different ways can you place those aces. And that's because you have ace one, ace two, ace three, ace four. You have four selections for your first slot. Then, once you put it in, you have 3, 2, 1. So that's 4 factorial.

And then now you need to do this kind of thing. So what's your remaining partition left? 52 minus 4 cards is 48. So we only have this many left. And then the remaining slots?

AUDIENCE: [INAUDIBLE]

VINA NGUYEN: OK. Oh, it was kind of written differently. But basically, this goes on top, that goes on the bottom. So you calculate it out and it looks like that.
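Both ways of doing the aces problem can be sketched and shown to agree, both giving about 0.1055:

```python
from fractions import Fraction
from math import factorial

# Sequential way: the first ace can land in any of the 52 slots; each
# later ace must land in one of the hands that has no ace yet.
seq = (Fraction(52, 52) * Fraction(39, 51)
       * Fraction(26, 50) * Fraction(13, 49))

# Counting way: 4! ways to place the aces, times the ways to deal the
# remaining 48 cards into four 12-card remainders, over all ways to
# partition 52 cards into four 13-card hands.
def multinomial(n, parts):
    out = factorial(n)
    for part in parts:
        out //= factorial(part)
    return out

count = Fraction(24 * multinomial(48, [12] * 4), multinomial(52, [13] * 4))
assert seq == count and abs(float(seq) - 0.1055) < 0.001
```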
