$E[Y|X]$ and $var[Y|X]$


Choose a number $X$ from the set $\{1,2,3\}$ uniformly at random. Then toss a fair coin $X$ times and let $Y$ denote the number of heads you see.

(i) Compute $E[Y|X]$ and var$[Y|X]$.
(ii) Compute $E[Y]$ and var$[Y]$.

(i) $$P(X=1)=\frac{1}{3},P(X=2)=\frac{1}{3},P(X=3)=\frac{1}{3}$$ Thus $E[X]=2$ and var$[X]=\frac{2}{3}.$

Using this, we can compute $E[Y|X=1]$, $E[Y|X=2]$, and $E[Y|X=3]$. Computing these yields $\frac{1}{2}$, $1$, and $\frac{3}{2}$ respectively.

$$E[Y|X]=\frac{1}{2}X$$ Given $X$, $Y$ is binomial with $X$ trials and success probability $\frac12$, and each trial contributes variance $\frac12\cdot\frac12=\frac14$, so $$\mathrm{var}[Y|X]=\frac{X}{4}.$$ (ii) $$E[Y]=E[E[Y|X]]=E\left[\frac{1}{2}X\right]=\frac{1}{2}E[X]=\frac{1}{2}(2)=1$$ $$\mathrm{var}(Y)=\mathrm{var}[E[Y|X]]+E[\mathrm{var}[Y|X]]=\mathrm{var}\left(\frac{1}{2}X\right)+E\left(\frac{X}{4}\right)=\frac{1}{4}\mathrm{var}(X)+\frac{1}{4}E(X)=\frac{1}{4}\cdot\frac{2}{3}+\frac{1}{4}\cdot 2=\frac{1}{6}+\frac{1}{2}=\frac{2}{3}$$



On BEST ANSWER

By computing $E(Y|X=1)=1/2,$ $E(Y|X=2)=1,$ etc you have computed $E(Y|X).$ You can write the answer more succinctly as $E(Y|X) = X/2.$

You don't necessarily need any information on the distribution of $X$ to compute $E(Y|X)$ or $\mathrm{Var}(Y|X)$: you are conditioning on $X$, i.e. assuming $X$ is known. However, the distribution of $X$ will be important when you compute $E(Y)$ and $\mathrm{Var}(Y)$.

So you can compute the conditional variance in the same spirit as you got the conditional expectation. Assume $X$ is known and compute what the variance of $Y$ would be. Since the variance of the number of heads from $X$ coin flips is $X/4$, you have $\mathrm{Var}(Y|X) = X/4.$
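For a known value of $X$, the fact that the conditional variance equals $X/4$ can be checked directly by enumerating the binomial pmf. A small sketch (the helper name `binom_var` is my own, for illustration):

```python
from math import comb

# Check that Var(Y | X = x) = x/4 for a fair coin by enumerating the
# binomial pmf of the head count directly (helper name is my own).
def binom_var(x, p=0.5):
    pmf = [comb(x, k) * p**k * (1 - p)**(x - k) for k in range(x + 1)]
    mean = sum(k * q for k, q in enumerate(pmf))
    return sum((k - mean) ** 2 * q for k, q in enumerate(pmf))

for x in (1, 2, 3):
    print(x, binom_var(x))  # 0.25, 0.5, 0.75 — i.e. x/4
```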

You are correct that in order to get $E(Y)$ and $\mathrm{Var}(Y)$ you need to apply the law of total expectation and total variance.
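Carrying those two laws out with exact rationals gives the unconditional answers; a short sketch (variable names are mine, not from the post):

```python
from fractions import Fraction as F

# Law of total expectation / total variance, with exact rationals.
xs = (1, 2, 3)
px = F(1, 3)                           # P(X = x), uniform on {1, 2, 3}

EX   = sum(px * x for x in xs)         # E[X]   = 2
EX2  = sum(px * x * x for x in xs)     # E[X^2] = 14/3
VarX = EX2 - EX ** 2                   # Var(X) = 2/3

EY   = F(1, 2) * EX                    # E[Y] = E[E(Y|X)] = E[X/2]
VarY = F(1, 4) * EX + F(1, 4) * VarX   # E[Var(Y|X)] + Var(E(Y|X))

print(EY, VarY)  # 1 2/3
```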


We know that the conditional distribution of $Y$ given $X$ is binomial, with $X$ trials and success probability $1/2$: $$Y\mid X ~\sim~\mathcal{Bin}(X,1/2)$$

This tells us the conditional expectation and variance: $$\mathsf E(Y\mid X) ~=~ X\cdot\tfrac 12 \\[1ex] \mathsf {Var}(Y\mid X) ~=~ X\cdot \tfrac 12\cdot (1-\tfrac 12)$$

(The result is well known; however, if you wish, you may derive it from the fact that $Y$ is the number of successes in a series of $X$ independent Bernoulli trials $\{Y_k\}$, each with success rate $1/2$. Then linearity of expectation and additivity of variance for independent summands give: $\mathsf E(Y\mid X) ~=~ \sum_{k=1}^X \mathsf E(Y_k)$ and $\mathsf {Var}(Y\mid X)~=~\sum_{k=1}^X\mathsf {Var}(Y_k)$.)

Then we may use the Laws of Total Expectation and Variance:$$\begin{align}\mathsf E(Y)~&=~\mathsf E(\mathsf E(Y\mid X)) \\[1ex] &=~ \tfrac 12\mathsf E(X)\\[1ex] &~~\vdots\\[5ex]\mathsf{Var}(Y)~&=~\mathsf E(\mathsf{Var}(Y\mid X))+\mathsf {Var}(\mathsf E(Y\mid X))\\[1ex] &=~\tfrac 14 \mathsf E(X)+\tfrac 12^2~\mathsf{Var}(X)\\[1ex] &~~\vdots\end{align}$$

Since you know that $X$ is discrete uniform over the support $\{1,2,3\}$, you may complete the computation.
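As a final sanity check, a short Monte Carlo simulation of the experiment (my own sketch; sample size and seed are arbitrary choices) should land near the exact values $\mathsf E(Y)=1$ and $\mathsf{Var}(Y)=2/3$ that the computation above produces:

```python
import random

# Simulate: draw X uniform on {1,2,3}, flip a fair coin X times, count heads.
# (A sketch; sample size and seed are my own choices.)
random.seed(1)
n = 200_000
ys = []
for _ in range(n):
    x = random.choice((1, 2, 3))
    ys.append(sum(random.random() < 0.5 for _ in range(x)))

mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / n
print(mean, var)  # close to 1 and 2/3
```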