Expected Value, Variance, and Covariance of Card Game


I am attempting the following problem and was hoping to get some guidance/tips based on what I have so far:

You toss a card that is black on one side and red on the other. It lands black with probability P(B) = 1/2 and red with probability P(R) = 1/2. With that in mind, we can play the following two games:

  • Game I: Win 1.00 if it lands on black, and lose 0.50 if it lands on red.

  • Game II: Win 5.00 if it lands on black, and lose 6.00 if it lands on red.


Suppose I and II are random variables that represent your earnings when playing games I and II. Find the following:

  • E(I) \begin{align*} E(I)=\frac{1}{2}(1.00)+\frac{1}{2}(-0.50)=0.25\\ \end{align*}

  • E(II) \begin{align*} E(II)=\frac{1}{2}(5.00)+\frac{1}{2}(-6.00)=-0.50\\ \end{align*}

  • E(I + II) \begin{align*} E(I + II)= E(I) + E(II) = 0.25 + (-0.50) = -0.25 \end{align*} (by linearity?)
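These values are easy to sanity-check numerically; here is a minimal sketch in Python (the variable names are my own):

```python
# One toss: black or red, each with probability 1/2.
p = 0.5
E_I  = p * 1.00 + p * (-0.50)   # game I payoffs
E_II = p * 5.00 + p * (-6.00)   # game II payoffs

# Linearity of expectation: E(I + II) = E(I) + E(II).
E_sum = E_I + E_II

print(E_I, E_II, E_sum)   # 0.25 -0.5 -0.25
```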


Then find the following:

  • var(I) \begin{align*} var(I)= E[(I-\mu)^2] = (1.00-0.25)^2(\frac{1}{2})+(-0.50-0.25)^2(\frac{1}{2}) = 0.5625\\ \end{align*}

  • var(II) \begin{align*} var(II)= E[(II-\mu)^2] = (5.00-(-0.50))^2(\frac{1}{2})+(-6.00-(-0.50))^2(\frac{1}{2}) = 30.25\\ \end{align*}

  • cov(I,II) \begin{align*} cov(I,II)= E(I*II)-E(I)E(II) = E(I*II)-(0.25)(-0.50); E(I*II)=(?) \\ \end{align*}

  • var(I + II) \begin{align*} var(I+II)= ? \\ \end{align*}
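The remaining two quantities can be checked by enumerating the joint distribution. A sketch in Python, assuming the two games are decided by independent tosses (the interpretation the answers below take):

```python
from itertools import product

p = 0.5
I_vals  = [1.00, -0.50]   # game I: black, red
II_vals = [5.00, -6.00]   # game II: black, red

E_I  = sum(p * x for x in I_vals)
E_II = sum(p * y for y in II_vals)

var_I  = sum(p * (x - E_I) ** 2 for x in I_vals)     # 0.5625
var_II = sum(p * (y - E_II) ** 2 for y in II_vals)   # 30.25

# Independent tosses: the four (I, II) pairs each have probability 1/4.
E_prod  = sum(0.25 * x * y for x, y in product(I_vals, II_vals))
cov     = E_prod - E_I * E_II                        # 0.0
var_sum = var_I + var_II + 2 * cov                   # 30.8125
print(var_I, var_II, cov, var_sum)
```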


Based on what was computed above, answer the following:

  1. Is a single game of game I or game II riskier?; A single game of game II because E(II) < E(I).

  2. Which single game is the most profitable in the long run, game I or game II?; A single game of game I for the same reason as above.

  3. Suppose you have an infinite amount of money and you can play game I or game II as many times as you want. What is the most profitable strategy (only play game I, only play game II, play both game I and game II but game I two times more frequently than game II, etc.)?


Best answer:

First of all, yes, $\mathsf{E}[\cdot]$ is a linear operator, so your justification for $\mathsf{E}[I+II]$ holds.

For $\mathsf{E}[I*II],$ let $X=I$ and $Y=II$. Then $I*II = XY$.

$$\mathsf{E}[XY] = \sum_kk\cdot P(XY=k)$$

$XY$ has four possible outcomes, each with probability $\frac14$: $-6$, $-2.5$, $3$, $5$. So

$$\mathsf{E}[XY]=\frac14(-6-2.5+3+5)=-\frac18 \\ \therefore \mathcal{Cov}(I,II)=\mathcal{Cov}(X,Y)=\mathsf{E}[XY]-\mathsf{E}[X]\mathsf{E}[Y]=-\frac18-(0.25)(-0.5)=0$$

This result should not be surprising because games $I$ and $II$ are clearly independent. Then

$$\mathcal{Var}(X+Y)=\mathcal{Var}(X)+\mathcal{Var}(Y)+0=30.8125$$

$1.$ This is sort of an odd question: what is meant by "riskier"? More likely to lose money? Or has more uncertainty? In either case, game $II$ is riskier, but the reasoning is either because of the difference in expectations or the differences in variances, depending on what you think your teacher means.

$2.$ You're right on this one.

$3.$ I agree with Nameless' answer, but for a little more reasoning, consider maximizing $\mathsf{E}[aX+bY]$ subject to the constraint $a+b=1$. Maximizing $\mathsf{E}$ over $a$ and $b$ gives the optimal long-run proportions of $X$ and $Y$ games to play.

$$\mathsf{E}[aX+bY]=a\mathsf{E}[X]+b\mathsf{E}[Y]=\frac14a-\frac12b$$ Since $a+b=1$,

$$\frac14a-\frac12b=\frac14-\frac34b \\ \therefore \cfrac{d}{db}\mathsf{E}[aX+bY] = -\frac34$$

$\therefore \ $ As the proportion of $II$ games played increases, $\mathsf{E}$ decreases. Thus no $II$ games played is the optimal strategy.
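The same conclusion can be checked numerically: the expected earnings per play fall linearly in the fraction $b$ of game $II$ plays. A sketch (the function name is my own):

```python
E_X, E_Y = 0.25, -0.50   # E[I] and E[II] computed above

def expected_per_play(b):
    """E[aX + bY] with a = 1 - b: expected earnings per play
    when a fraction b of plays are game II."""
    return (1 - b) * E_X + b * E_Y

# Slope is -3/4, so the expectation is maximized at b = 0
# (play game I only).
for b in (0.0, 0.25, 0.5, 1.0):
    print(b, expected_per_play(b))
```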

Second answer:
  1. Game II is riskier, not for your reason, but because its variance is higher. The variance of the payoff is usually taken as a risk measure; the larger the variance, the riskier the random variable.

  2. I because of the higher expected value, correct.

  3. Only I, by the law of large numbers, which says that the long run average converges to the expected value of the random variable. You expect to gain 0.25 every time you play it, while you expect to lose money every time you play II. Hence, you should be playing I every time.
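To illustrate the law-of-large-numbers point, a small simulation (a sketch; the seed and sample size are arbitrary choices of mine):

```python
import random

random.seed(0)

def average_earnings(win, loss, n=100_000):
    """Average earnings per play over n independent fair tosses."""
    total = sum(win if random.random() < 0.5 else loss for _ in range(n))
    return total / n

# Long-run averages settle near E(I) = 0.25 and E(II) = -0.50,
# so repeatedly playing game I gains money while game II loses it.
print(average_earnings(1.00, -0.50))
print(average_earnings(5.00, -6.00))
```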