Just looking for an explanation of how the conditional expectation "$E[X|Y]$" of any two random variables $X$ and $Y$ would change if we included the condition that $X$ and $Y$ be independent.
Normally:
- $E[X|Y=y] = \sum_x x\,P_{X|Y}(x|y)$, where $P_{X|Y}$ is the conditional PMF of $X$ given $Y$.
If $X$ and $Y$ are independent, could we just apply the property of independent events
- $P(A|B) = P(A)$
to get that $E[X|Y] = \sum_x x\,P_X(x) = E[X]$?
Thank you!
Yes: if $X$ and $Y$ are independent, then $E[X|Y]=E[X]$. In the case where $X$ and $Y$ are both discrete, your argument works exactly as written: applying $P(A|B)=P(A)$ with $A=\{X=x\}$ and $B=\{Y=y\}$ (where $P(Y=y)>0$) gives $P_{X|Y}(x|y)=P_X(x)$, so the conditional expectation reduces to $\sum_x x\,P_X(x)=E[X]$.
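As a quick sanity check, here is a small simulation (a hypothetical example, with $X$ uniform on $\{1,2,3\}$ and $Y$ uniform on $\{0,1\}$ chosen independently) showing that the empirical $E[X\,|\,Y=y]$ matches $E[X]$ for every value of $y$:

```python
import random
from collections import defaultdict

random.seed(0)

# Draw X and Y independently: X uniform on {1,2,3}, Y uniform on {0,1}
n = 200_000
samples = [(random.choice([1, 2, 3]), random.choice([0, 1])) for _ in range(n)]

# Empirical E[X]
e_x = sum(x for x, _ in samples) / n

# Empirical E[X | Y = y]: average X over the samples where Y = y
sums = defaultdict(float)
counts = defaultdict(int)
for x, y in samples:
    sums[y] += x
    counts[y] += 1

for y in sorted(counts):
    e_x_given_y = sums[y] / counts[y]
    print(f"E[X | Y={y}] ~ {e_x_given_y:.3f}   vs   E[X] ~ {e_x:.3f}")
```

Both conditional averages come out near $2$ (the true $E[X]$), up to sampling noise, illustrating that conditioning on an independent $Y$ leaves the expectation of $X$ unchanged.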