Expected value of a random sum via conditional expectation


Let $X$ be a random variable taking values in $\mathbb{N} \cup \{0\}$.

Let $Y_1, Y_2, \dots$ be independent, identically distributed random variables taking values in the same set, independent of $X$.

Let $ Z = \sum_{i=1}^X Y_i$.

Prove that $\mathbb E[Z] = \mathbb E[X] \mathbb E[Y_1] $

I tried expanding the right-hand side, but I have no idea how $Z$ behaves, since it always depends on $X$.

There are two solutions below.

\begin{eqnarray} EZ &=& E[E[Z|X]] \\ &=& \sum_n E[Z| X=n]P[X=n] \\ &=& \sum_n n E[Y_1] P[X=n] \\ &=& E[Y_1]\sum_n n P[X=n] \\ &=& E X \, E Y_1 \end{eqnarray} Addendum:

@Did suggested that I elaborate on one of the steps above: $E[\sum_{k=1}^X Y_k| X=n] = E[\sum_{k=1}^n Y_k | X=n] = E[\sum_{k=1}^n Y_k] = \sum_{k=1}^n E Y_k = n E Y_1$. The first equality holds because $X=n$ on the conditioning event, the second follows from independence of the $Y_k$ and $X$, the third from linearity of expectation, and the last because the $Y_k$ are identically distributed.
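The identity is easy to check empirically. Below is a minimal Monte Carlo sketch with hypothetical distributions (not from the question): $X$ uniform on $\{0,\dots,10\}$ and $Y_i$ fair die rolls, independent of $X$.

```python
import random

random.seed(0)

def sample_z():
    # X ~ Uniform{0,...,10}, so E[X] = 5  (hypothetical choice)
    x = random.randint(0, 10)
    # Z = sum of X iid die rolls; Y_i ~ Uniform{1,...,6}, so E[Y_1] = 3.5
    return sum(random.randint(1, 6) for _ in range(x))

trials = 200_000
est = sum(sample_z() for _ in range(trials)) / trials
print(est)  # should be close to E[X] * E[Y_1] = 5 * 3.5 = 17.5
```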

A second solution:

$$E[Z]=\sum_{x=0}^\infty E[Z|X=x]P(X=x)=\sum_{x=1}^\infty E[Z|X=x]P(X=x),$$ where the $x=0$ term vanishes because $Z=0$ when $X=0$.

But given $X=x$, $Z=\sum_{i=1}^{x}Y_i$, so $E[Z|X=x]=xE[Y_1]$ (as the $Y_i$ are iid and independent of $X$). Thus

$$\sum_{x=1}^\infty E[Z|X=x]P(X=x)=\sum_{x=1}^\infty xE[Y_1]P(X=x)=E[Y_1]\sum_{x=1}^\infty xP(X=x)=E[Y_1]E[X]$$
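This conditioning argument can also be verified exactly by brute-force enumeration on a tiny hypothetical example (distributions chosen here for illustration, not taken from the question): $X \in \{0,1,2\}$ and $Y_i \in \{1,2\}$.

```python
from itertools import product
from fractions import Fraction as F

# Hypothetical distributions for an exact check.
pX = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}   # E[X] = 1
pY = {1: F(1, 3), 2: F(2, 3)}               # E[Y_1] = 5/3

# E[Z] by enumerating all outcomes (x, y_1, ..., y_x).
EZ = F(0)
for x, px in pX.items():
    for ys in product(pY, repeat=x):   # empty tuple when x = 0, giving Z = 0
        p = px
        for y in ys:
            p *= pY[y]
        EZ += p * sum(ys)

EX = sum(x * p for x, p in pX.items())
EY = sum(y * p for y, p in pY.items())
print(EZ, EX * EY)  # both equal 5/3
```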