Expectation for probability distribution?


Today, I posted a question about the "Payoff of a dice game", and most of the answers explained the problem using the concept of the expected value of a probability distribution. I have read up on that, but I find that I am still confused.

With regard to the probability distribution, I fully understand why the sum of the products of the values of the random variable and their probabilities gives the expected value for a given experiment. However, I would like a deeper, more intuitive understanding of this concept. I watched Sal Khan's videos, in which he explains it with the following example:

"Imagine that you are playing basketball, and the probability that you make any given shot is 40%. If you attempt $n$ shots, the expected number of baskets is $np$, where $p$ is the probability of success, defined here as the event of making a basket. Think of it this way: every shot you attempt contributes 40% of a basket."

Although this is a clever way to think about it, I was wondering whether anybody else has further insights. How do you make sense of this concept in your head? Please share; it would help me tremendously.
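The $np$ claim in the quoted example can be checked with a quick simulation. This is only a sketch; the function name and numbers are my own, not from the video:

```python
import random

def simulate_shots(n, p, trials=10_000):
    """Average number of made baskets per session of n shots."""
    made = 0
    for _ in range(trials):
        # Each shot succeeds independently with probability p.
        made += sum(1 for _ in range(n) if random.random() < p)
    return made / trials

n, p = 10, 0.4
print(n * p)                 # theoretical expected value: 4.0
print(simulate_shots(n, p))  # simulated average, close to 4.0
```

The simulated average per session hovers near $np = 4$, which is exactly the "each shot contributes 40% of a basket" intuition.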

Edit: I realize that I have not been very clear in stating the question, and I want to clarify that I understand the logic behind the other question I posted. The reason I posted this question separately is that I wanted to know how other people make sense of these concepts of averages and expected values in their heads. What is the most useful way to think about them?

Thanks ever so much :) Regards.

There are 3 best solutions below

Answer 1 (3 votes):

Assume a random variable $X$ that denotes the number of heads when you toss two coins together. $X$ has three possible values: $0, 1, 2$. The probability of getting $0$ (that is, no heads) is the probability of getting tails on the first coin and tails on the second coin. This is $0.5 \times 0.5 = 0.25$.

The probability of getting $X = 1$ is the probability of getting heads on the first coin and tails on the second, OR vice versa (tails on the first coin, heads on the second). This is $0.25 + 0.25 = 0.5$.

Similarly, the probability of $X$ taking the value $2$ is the probability of getting two heads, which is $0.25$.

Now you ask yourself, "what result should I expect when I toss two coins, one after the other? Am I most likely to get two tails, two heads, or one head and one tail?"

To answer this, you multiply each value of $X$ by its corresponding probability. Why do we do this? Well, let's look at the definition of probability. The probability of an event can be defined as the frequency of that event divided by the sum of the frequencies of all events: $$\frac{\text{frequency of event } A}{\text{sum of frequencies of events } A, B, C, \ldots}$$

The sum of all frequencies covers the entire sample space. Now let's look at the definition of the "mean". If we are given a frequency table with values $A$, $B$ and $C$ and their corresponding frequencies $\text{freq}_A$, $\text{freq}_B$, $\text{freq}_C$, the mean is $$\frac{A \cdot \text{freq}_A + B \cdot \text{freq}_B + C \cdot \text{freq}_C}{\text{freq}_A + \text{freq}_B + \text{freq}_C}$$ or $$\frac{A \cdot \text{freq}_A}{\text{freq}_A + \text{freq}_B + \text{freq}_C} + \frac{B \cdot \text{freq}_B}{\text{freq}_A + \text{freq}_B + \text{freq}_C} + \frac{C \cdot \text{freq}_C}{\text{freq}_A + \text{freq}_B + \text{freq}_C}.$$ This is nothing but $$A \cdot P(A) + B \cdot P(B) + C \cdot P(C),$$ where $P(A)$ is the probability of $A$.

So we see that the mean value (or expectation value) is defined as above.
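As a quick numerical check of this identity, here is a small sketch; the values and frequencies below are made up for illustration:

```python
# Made-up frequency table: values A, B, C with their frequencies.
values = [1, 2, 3]      # A, B, C
freqs = [10, 30, 60]    # freq_A, freq_B, freq_C
total = sum(freqs)

# Mean computed directly from the frequency table.
mean_from_freqs = sum(v * f for v, f in zip(values, freqs)) / total

# The same mean as a probability-weighted sum.
probs = [f / total for f in freqs]
mean_from_probs = sum(v * p for v, p in zip(values, probs))

print(mean_from_freqs)  # 2.5
print(mean_from_probs)  # also 2.5, up to floating-point rounding
```

Both computations give the same number, which is the point of the algebra above: dividing each frequency by the total turns it into a probability.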

In the case of my example, the expected value is $0 \times 0.25 + 1 \times 0.5 + 2 \times 0.25 = 1$. This means that if I toss two fair coins a very large number of times, the average number of heads per toss will be $1$; one head and one tail is also the most common outcome. Calculating the mean, or "expectation" value, beforehand means that I can safely predict that over a large number of two-coin tosses, I will get one head and one tail on average.
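A simulation makes the "large number of times" part concrete. This is a sketch of my own, not part of the answer's argument:

```python
import random

def heads_in_two_tosses():
    """Number of heads when two fair coins are tossed."""
    return sum(random.random() < 0.5 for _ in range(2))

# Theoretical expectation: 0*0.25 + 1*0.5 + 2*0.25
expected = 0 * 0.25 + 1 * 0.5 + 2 * 0.25

trials = 100_000
average = sum(heads_in_two_tosses() for _ in range(trials)) / trials
print(expected)  # 1.0
print(average)   # settles close to 1.0 over many trials
```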

Answer 2 (1 vote):

I don't know if I fully understand the question. Are you asking for an intuitive explanation of why the expectation of a sum of random variables is the sum of the expectations? We can break the game down into $6$ different events, where $A_i$ is the event "the die shows the value $i$". Now we define the payoff rule: $X$ is a function that assigns a value to each event. Intuitively, $X$ is the game manager, who takes or gives you money based on the outcome of the die. In your table you have already defined $X$: $$X(A_i)=M_i$$

The expectation can be seen as a weighted sum of the values of the random variable, where the weights are the probabilities. If your expectation is negative, you will lose money on average, and if it is positive, you will win money on average. It is as if you are asking what the game manager will give you, per round, in the long run. Hope this helps.
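As a concrete sketch of this weighted sum: the actual payoff table $M_i$ is in the original question, so the payoffs below are invented purely for illustration (chosen so the expectation comes out to $-1/6$):

```python
from fractions import Fraction

# Hypothetical payoffs M_i = X(A_i) for each face i of a fair die;
# the real table is in the original question, these are made up.
payoffs = {1: -2, 2: -1, 3: 0, 4: 0, 5: 1, 6: 1}

p = Fraction(1, 6)  # each face of a fair die is equally likely

# Expectation as a probability-weighted sum of the payoffs.
expectation = sum(p * m for m in payoffs.values())
print(expectation)  # -1/6: on average you lose 1/6 of a coin per round
```

A negative result means the game manager takes money from you in the long run; a positive one means the game favors you.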

Answer 3 (4 votes):

You have said in the comments that you don't quite grasp the idea behind averages, so I'll try to respond to that.

The average is good for making guesses about large sets of data. If there were $2000$ people in a hall and you wanted to know how much they weigh together, you could visit them one by one, ask each for their weight, and sum the numbers up. (That looks pretty tedious.) Or, you could ask only $100$ of them, take the average of their weights, and multiply it by $2000$ to get your guess. If you add up the weights of $2000$ people, there is a good chance the result will be quite similar to the case where every one of them has the average weight.

However, it would be a mistake to draw any strong conclusions about the weight of any single person from the average. Let's say the average is $80$ kg. In the group of $2000$, there may be little children weighing $20$ kg, very heavy people weighing $150$ kg, and most people somewhere in between. It can easily be more likely that you encounter people whose weight isn't anywhere near $80$ kg. But if you come across a group of $10$, there is a good chance their total weight will be around $800$ kg.

The expected value of a random variable follows the same idea. For a six-sided die, the expected value of a roll is $3.5$. If you roll $1000$ times, you can be pretty confident that the sum will be somewhere around $3500$. (If you want this formulated more precisely, it may be worth looking up the normal distribution and the central limit theorem.) But the figure $3.5$ doesn't tell you anything about the result of a single roll. That may still be anything from $1$ to $6$.
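The 1000-roll experiment is easy to try directly; this is a sketch of my own, with names I chose:

```python
import random

rolls = 1000
total = sum(random.randint(1, 6) for _ in range(rolls))

# Expected value of a single roll: (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5
print(rolls * 3.5)  # expected sum: 3500.0
print(total)        # the actual sum lands somewhere near 3500
```

Run it a few times: the sum wanders, but rarely strays far from $3500$, while any individual roll remains completely unpredictable.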

And that's the logic behind the expected payoff of $-1/6$ in the dice game from your previous question. It doesn't say that if you play one round, you lose $1/6$ of a coin. It only says that if you play, say, $1200$ rounds, the result will be quite similar to the case where you lose $1/6$ of a coin each round; in other words, you can count on losing about $200$ coins. Nothing less, nothing more. So if somebody offers you $1200$ rounds of this game, it's not worth accepting, but if you play only $10$ rounds, the results may vary and you may very well come out ahead.

I hope this is helpful, and that others will bear with my not-very-mathematical approach.