How to show $E[X|Y=y]=\frac{E[X1_{Y=y}]}{P(Y = y)}$ if $P(Y = y) > 0$?


Here $X$ and $Y$ are real-valued random variables, $X$ is integrable.

We know any version of $E[X|Y]$ can be written as a function of $Y$, say $f(Y)$. I can prove that when $P(Y = y) > 0$, $f(y)$ is the same for every version of $f$. Today I saw a claim that $f(y)$ is just $\frac{E[X1_{Y=y}]}{P(Y = y)}$ in this scenario. This looks intuitively correct, but how can it be proved?


There are 3 answers below.

Best answer:

By the definition of conditional expectation, $f(Y)=E(X|Y)$ implies that $E[f(Y)1_{Y^{-1}(A)}]=E[X1_{Y^{-1}(A)}]$ for every Borel set $A$. Taking $A=\{y\}$ we get $E[f(Y)1_{Y=y}]=E[X1_{Y=y}]$. But $E[f(Y)1_{Y=y}]=E[f(y)1_{Y=y}] =f(y)P(Y=y)$, since $f(Y)=f(y)$ on the event $\{Y=y\}$. So $E[X1_{Y=y}]=f(y)P(Y=y)$. Divide by $P(Y=y)$ to finish.

A second answer:

This comes out quite cleanly when $X$ and $Y$ are discrete. We start from the formula for the conditional expectation: \begin{equation} E[X|Y=y]= \sum_x xP(X=x|Y=y) \,. \end{equation} Applying the definition of conditional probability and factoring out the $P(Y=y)$ term, \begin{equation} E[X|Y=y]= \sum_x xP(X=x|Y=y) = \sum_x x\frac{P(X=x,Y=y)}{P(Y=y)} =\frac{\sum_x xP(X=x,Y=y)}{P(Y=y)} \,. \end{equation} We will show $E[X \cdot 1_{\{Y=y\}}] = \sum_x x P(X=x, Y=y)$. Writing the expectation as a sum over the joint distribution, with $y'$ ranging over the values of $Y$, \begin{equation} E[X \cdot 1_{\{Y=y\}}] = \sum_x \sum_{y'} x \cdot 1_{\{y'=y\}} P(X=x, Y=y') \,. \end{equation}

Since the indicator $1_{\{y'=y\}}$ equals 1 when $y' = y$ and 0 otherwise, the double sum reduces to a single sum:

\begin{equation} E[X \cdot 1_{\{Y=y\}}] = \sum_x x P(X=x, Y=y) \,. \end{equation}

Thus, \begin{equation} E[X|Y=y]= \frac{\sum_x xP(X=x,Y=y)}{P(Y=y)} =\frac{E[X \cdot 1_{\{Y=y\}}]}{P(Y=y)} \,, \end{equation}

which is the desired result.
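As a numerical sanity check of the identity in the discrete case, here is a short Python sketch on a small hypothetical joint distribution (the table `joint` is made up for illustration): it compares $E[X|Y=y]$ computed from the conditional pmf against $E[X 1_{\{Y=y\}}]/P(Y=y)$.

```python
# Sanity check of E[X | Y=y] = E[X 1_{Y=y}] / P(Y=y) for discrete X, Y.
# joint[(x, y)] = P(X=x, Y=y); values are arbitrary but sum to 1.
joint = {
    (1, 0): 0.10, (2, 0): 0.20, (3, 0): 0.10,
    (1, 1): 0.25, (2, 1): 0.05, (3, 1): 0.30,
}
assert abs(sum(joint.values()) - 1.0) < 1e-12

def cond_exp_direct(y):
    """E[X | Y=y] via the conditional pmf: sum_x x * P(X=x | Y=y)."""
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    return sum(x * p / p_y for (x, yy), p in joint.items() if yy == y)

def cond_exp_indicator(y):
    """E[X 1_{Y=y}] / P(Y=y): the indicator kills all terms with Y != y."""
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    e_x_ind = sum(x * p for (x, yy), p in joint.items() if yy == y)
    return e_x_ind / p_y

for y in (0, 1):
    assert abs(cond_exp_direct(y) - cond_exp_indicator(y)) < 1e-12
```

The two computations agree for every $y$ with $P(Y=y)>0$, matching the derivation above.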

A third answer:

Write $f(y)=E\big[X\mid Y=y\big]$, a constant. Then $$f(y)\,P(Y=y)=E\big[f(y)\,I_{Y=y}\big]=E\big[f(Y)\,I_{Y=y}\big]=E\Big[E\big[X\mid Y\big]I_{Y=y}\Big]=E\big[XI_{Y=y}\big],$$ where the second equality holds because $f(Y)=f(y)$ on the event $\{Y=y\}$, and the last is the defining property of $E[X\mid Y]$ applied to the $\sigma(Y)$-measurable event $\{Y=y\}$. Dividing by $P(Y=y)$ gives the claim.