Variance when playing a game with a fair coin


I am having a hard time with this question for some reason.

You and a friend play a game where you each toss a balanced coin. If the upper faces on the coins are both tails, you win \$1; if the faces are both heads, you win \$2; if the coins do not match (one shows head and the other tail), you lose \$1. Calculate the expected value and standard deviation for your total winnings from this game if you play 50 times.

PMF values: $$\begin{array}{c|c} x & p(x)\\\hline +\$1 & .25\\ +\$2 & .25\\ -\$1 & .50 \end{array}$$

I have calculated the expectation as $$1(.25)+2(.25)+(-1)(.5) = .25,$$ so $$E(50X) = 50\cdot.25 = \$12.5,$$ which I have confirmed is correct.

I know I need to get $\operatorname{Var}(50X)$, but doing a standard variance calculation and then using the formula $a^2\operatorname{Var}(X)$ is not giving me the correct value.

What step am I missing?


Best answer:

Variance is the mean of the squares minus the square of the mean: $$ 0.25\cdot1^2+0.25\cdot2^2+0.5\cdot(-1)^2-0.25^2=1.6875. $$ For independent plays, the variance of a sum is the sum of the variances, so the variance over $50$ plays is $$ 50\cdot1.6875=84.375, $$ and the standard deviation the question asks for is $\sqrt{84.375}\approx 9.19$.
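The arithmetic above can be checked exactly with a short script (a sketch; it just encodes the PMF from the question and uses exact fractions to avoid rounding):

```python
from fractions import Fraction

# PMF from the question: winnings -> probability
pmf = {1: Fraction(1, 4), 2: Fraction(1, 4), -1: Fraction(1, 2)}

# E[X] = sum of x * p(x)
mean = sum(x * p for x, p in pmf.items())

# Var(X) = E[X^2] - (E[X])^2
second_moment = sum(x**2 * p for x, p in pmf.items())
var = second_moment - mean**2

# For 50 independent plays, expectations and variances both add
total_mean = 50 * mean
total_var = 50 * var
total_sd = float(total_var) ** 0.5

print(total_mean)            # 25/2  (= 12.5)
print(total_var)             # 675/8 (= 84.375)
print(round(total_sd, 3))    # 9.186
```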

Another answer:

$$\operatorname{Var}\left(\sum_{k=1}^{50}X_k\right) = 50\operatorname{Var}(X) = 50\left[.25(1-.25)^2 + .25(2-.25)^2 + .5(-1-.25)^2\right] = 84.375$$

Another answer:

You are confusing the distribution of $50X_1$ and $\sum_{k=1}^{50}X_k$ when ${(X_k)}_{k=1}^n$ is a sequence of independent and identically distributed random variables.

It is true that $\mathsf E(50X_1)=50\mathsf E(X_1)$ and $\mathsf{Var}(50X_1)=2500\mathsf {Var}(X_1)$. However, that is not what you are dealing with here.


By the linearity of expectation, the expectation of the sum is the sum of the expectations. It is because the distributions are identical that this sum is equal to $50$ times an individual expectation.

$$\begin{align}\mathsf E\left(\sum_{k=1}^{50}X_k\right) ~&=~ \sum_{k=1}^{50}\mathsf E(X_k) & \text{Linearity of Expectation} \\[1ex] & =~ 50\,\mathsf E(X_1) & \text{Identical Distributions}\end{align}$$

Similar result, different reasoning.

(Note: we have not used independence up to this point.)


The distinction becomes apparent in dealing with the variance.

When it comes to the variance of the sum, we have to employ the bilinearity of covariance.

$$\begin{align}\mathsf {Var}(\sum_{k=1}^{50} X_k) ~&=~ \mathsf {Cov}(\sum_{k=1}^{50}X_k,\sum_{j=1}^{50}X_j) \\ &=~ \sum_{k=1}^{50}\sum_{j=1}^{50}\mathsf{Cov}(X_k,X_j) &&\text{Bilinearity of Covariance} \\ &=~ \sum_{k=1}^{50}\mathsf {Cov}(X_k,X_k) ~+~ 0 &&\text{Independence: }\mathsf{Cov}(X_j,X_k)=0\text{ when }j\neq k \\[1ex] &=~ 50\mathsf {Var}(X_1) && \text{Identical distributions} \end{align}$$
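A quick Monte Carlo check makes the distinction concrete (a sketch; the helper names are mine): the variance of the sum of $50$ independent plays lands near $84.375$, while $50X_1$ has variance near $2500\cdot1.6875 = 4218.75$.

```python
import random

random.seed(0)

def play():
    # One play: +1 with prob .25, +2 with prob .25, -1 with prob .5
    u = random.random()
    if u < 0.25:
        return 1
    if u < 0.50:
        return 2
    return -1

def sample_var(xs):
    # Unbiased sample variance
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

N = 20_000
sums = [sum(play() for _ in range(50)) for _ in range(N)]  # sum of 50 plays
scaled = [50 * play() for _ in range(N)]                   # 50 times ONE play

print(sample_var(sums))    # near 50   * 1.6875 = 84.375
print(sample_var(scaled))  # near 2500 * 1.6875 = 4218.75
```

The two estimates differ by a factor of about $50$, which is exactly the gap between $\mathsf{Var}\left(\sum_k X_k\right)$ and $\mathsf{Var}(50X_1)$.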