Intuitive Understanding Of Expected Value In Probability


I don't understand the concept of expected value in probability (https://en.wikipedia.org/wiki/Expected_value). I don't understand why one should multiply the value of an outcome by the probability of it occurring to find the expected value, or what this expected value means.

My guess is that it means the average value of an event. For instance, if you get 30 dollars for rolling a 1 on a die, and there is a 1/6 chance of rolling a one, then over 6 games the average amount you gain per game is $5.

But then, this leads to some things that don't make sense. Namely, what would the expected value of playing 1 game be? It can't be $5, because that's the average of 6 games! If you only play 1 game, that wouldn't make any sense. Same applies for playing any number of games that is not a multiple of 6. How would it make sense to have 5 dollars your expected value, if that represents the average result of playing 6 games?

Can someone please explain, thank you!

Answer 1:

> Namely, what would the expected value of playing 1 game be? It can't be $5, because that's the average of 6 games! If you only play 1 game, that wouldn't make any sense. Same applies for playing any number of games that is not a multiple of 6. How would it make sense to have 5 dollars your expected value, if that represents the average result of playing 6 games?

If we were to think of it as receiving exactly \$30 at every multiple of 6 games, that would be pretty misleading in my opinion. For example, if you put in \$50 to roll the die 11 times, then by that reasoning you would expect to lose \$20 (since you only get \$30 at each multiple of 6). But all of a sudden, if you roll the die a 12th time, you are now winning. What makes that 12th roll (and the 18th, 24th, etc.) special? Nothing, because those rolls aren't special at all! So that framing is misleading.

The key here is that thinking of it in discrete steps, where you receive a fixed payout every $n$ rolls, is misleading precisely because there is nothing special about the $n$th roll. The probability that you roll a 1 on your first try is the same as on your sixth, so the expected payout of every roll is the same.
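To see that the per-game average doesn't depend on playing a multiple of 6 games, here is a quick Monte Carlo sketch of the die game from the question (win \$30 on a 1, nothing otherwise):

```python
import random

def average_payout(num_games, trials=100_000):
    """Estimate the average payout per game over a session of
    `num_games` die rolls, averaged across many sessions.
    Each roll pays $30 on a 1 and $0 otherwise."""
    total = 0.0
    for _ in range(trials):
        session = sum(30 if random.randint(1, 6) == 1 else 0
                      for _ in range(num_games))
        total += session / num_games
    return total / trials

# The per-game average hovers near $5 whether or not the
# number of games is a multiple of 6.
for n in (1, 6, 11, 12):
    print(n, round(average_payout(n), 2))
```

Whatever session length you pick, the estimate settles near \$5 per game; nothing special happens at 6, 12, or 18 rolls.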

But yes, I see where you're coming from on the other hand. Consider the lottery. If you have a 1-in-a-million chance of winning the lottery, and the jackpot is 1 million dollars, then your expected winnings are \$1. But of course, you're not going to be handed any money unless you win it all. So what's the deal? I like to frame it as a question of whether the bet is worth it statistically: if I spend 2 dollars on this lotto ticket, is that a wise investment? A statistics-minded person would say no, precisely because of the classic expected value definition.
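The lottery arithmetic above is just the same weighted sum, written out (the \$2 ticket price is the figure used in the paragraph above):

```python
# Expected net value of one lottery ticket: a 1-in-1,000,000
# chance at a $1,000,000 jackpot, bought for $2.
p_win = 1 / 1_000_000
jackpot = 1_000_000
ticket_price = 2

expected_winnings = p_win * jackpot              # expected gross winnings: $1
expected_net = expected_winnings - ticket_price  # expected net: -$1 per ticket
```

The expected net is negative, which is exactly why the statistically minded buyer passes, even though any individual ticket either wins a million or nothing.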

So in summary, expected values are averages. Another way to think of it: say you drive a car for two hours. The first hour you go a slow 10 mph; the next hour you pick up the pace and go 60 mph. You have gone 70 miles in 2 hours, so your average speed is 35 mph. You're right that it would be incorrect to say "this means I went 35 miles in the first hour," since you only went 10 miles. But it is correct to say that over the 2 hours you traveled a total of 70 miles. Expected value works the same way.

Answer 2:

Some recap of definitions:

The expected value is a property of a random variable, also known as its mean. It gives us info about the distribution of the variable.

The expected value of a discrete random variable is the probability-weighted average of all its possible values, while for a continuous random variable it is the integral of the variable weighted by its pdf.

A sample is a realization of a collection of identically distributed random variables. The law of large numbers guarantees that the average (arithmetic mean) of a large number of these realizations or trials is close to the expected value. Notice the "large number" qualifier: the sample mean involves no explicit probability weighting, but with many trials the relative frequencies of the outcomes approximate their probabilities, so the two averages agree.

Example:

You consider betting on a game where you win $\$10,000$ with a probability of $0.99,$ but lose $\$100,000$ with a probability of $0.01.$ What is the expected payoff?

$$\mathsf E[\text{payoff game}] = 0.99\times \$10,000+0.01\times(-\$100,000)=\$8,900$$

With this expected payoff, who would be so foolish as not to bet? Well, most likely every average Joe working for a living, who couldn't afford to lose even once (bankruptcy is an absorbing state: once ruined, you can't keep playing). Could this mean that the payoff calculation is wrong? Not at all. In fact, this would be a terrific opportunity to make money for someone (or a company) with deep pockets, for whom losing a few hundred grand poses no risk of ruin.
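A short simulation of this bet, using the probabilities and payoffs given above, shows the sample mean approaching the \$8,900 expectation over many plays:

```python
import random

def simulate_bets(num_bets):
    """Sample independent payoffs of the bet described above:
    +$10,000 with probability 0.99, -$100,000 with probability 0.01."""
    return [10_000 if random.random() < 0.99 else -100_000
            for _ in range(num_bets)]

payoffs = simulate_bets(1_000_000)
sample_mean = sum(payoffs) / len(payoffs)
print(sample_mean)  # close to the $8,900 expectation
```

The deep-pocketed player effectively experiences this long-run average; the average Joe may hit a single $-\$100{,}000$ draw before the averaging can rescue him.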

So the expected value, which can be estimated by means of the sample mean when we don't know the probabilities of every outcome, expresses a long-run tendency. Hopefully this clarifies that the expectation of $\$5$ in your post is the result of

$$\small\mathsf E[\text{payoff die}]=\frac{1}{6}\times\$30+\frac{1}{6}\times\$0+\frac{1}{6}\times\$0+\frac{1}{6}\times\$0+\frac{1}{6}\times\$0+\frac{1}{6}\times\$0=\$5$$

and it is correct, but it says nothing about the probability of winning on a single die toss, i.e. $\frac{1}{6},$ which comes with a payout of $\$30,$ or the probability of losing, i.e. $\frac{5}{6}.$
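The probability-weighted sum above can be written directly as code; the payout table mirrors the six equally likely faces:

```python
# Probability-weighted average of the die game's payouts:
# face 1 pays $30, every other face pays $0.
payout = {1: 30, 2: 0, 3: 0, 4: 0, 5: 0, 6: 0}
expected = sum(payout[face] * (1 / 6) for face in payout)
print(expected)  # approximately 5.0
```

The single-toss outcome is always either \$30 or \$0; the \$5 only emerges as the weights are applied across all faces, i.e. as a long-run per-game average.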