Calculating an average via the normal distribution


Given a fair die: if the result is $1$ or $2$ the profit is $3$ USD; if the result is $6$ you neither win nor lose anything; for every other result you lose $2$ USD.

Suppose we play the game $125$ times each day. What is the profit threshold such that on only $10\%$ of days our average profit per game is less than it?

I started by recognizing that this calls for the normal approximation: since we play the game $125$ times, the distribution of the average profit will be close to normal.

I let $X$ be the profit of a single game, so I have calculated $E(X)$ and $V(X)$ using the following table: $$\begin{array}{|c|c|c|c|} \hline X& -2 & 0 & 3 \\ \hline p(x)& \frac{3}{6}& \frac{1}{6}&\frac{2}{6}\\ \hline \end{array}$$

And got: $E(X) = 0$ and $V(X) = 5$.
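As a quick sanity check of these two values, we can enumerate the six equally likely faces directly (a minimal sketch using only the payoff rule stated in the question; the dictionary name is my own):

```python
from fractions import Fraction

# Payoff per die face: 1,2 -> +3 USD; 6 -> 0; 3,4,5 -> -2 USD.
profit = {1: 3, 2: 3, 3: -2, 4: -2, 5: -2, 6: 0}

E = sum(Fraction(1, 6) * p for p in profit.values())        # E(X)
E2 = sum(Fraction(1, 6) * p * p for p in profit.values())   # E(X^2)
V = E2 - E**2                                               # V(X) = E(X^2) - E(X)^2

print(E, V)  # 0 5
```

Exact fractions avoid any floating-point rounding and confirm $E(X)=0$, $V(X)=5$.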

Now I wonder: how do I calculate the required threshold?


OK, so for this type of problem we use the Central Limit Theorem. The Central Limit Theorem tells us that for a large sample, the distribution of the sample mean approaches a normal distribution. If we play the game $125$ times each day, then the mean (average) profit per game on a day is approximately normal with $$ E(\bar X)=0, \qquad V(\bar X) = \frac{5}{125} = 0.04, \qquad \sigma(\bar X) = \sqrt{\frac{5}{125}} = 0.2. $$

From here you can take the inverse of the standard normal CDF at $0.10$ to get a $z$-score, $z \approx -1.2816$, and convert it back to the original scale: the threshold is $0 + z \cdot 0.2 \approx -0.256$ USD per game.
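The inverse-CDF step above can be sketched with Python's standard library (`statistics.NormalDist`), using the mean and variance derived in the answer:

```python
from statistics import NormalDist

n = 125
mu, var = 0, 5                   # per-game mean and variance from the question
sigma_bar = (var / n) ** 0.5     # std. dev. of the daily average = 0.2

# 10th percentile of the (approximately normal) daily average profit:
threshold = NormalDist(mu=mu, sigma=sigma_bar).inv_cdf(0.10)
print(round(threshold, 3))  # -0.256
```

So on $10\%$ of days the average profit per game falls below roughly $-0.256$ USD.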