I am reading about the law of total expectation and came across the following result: $\mathbb{E}[Y|X]=\mathbb{E}[Y]$, where $X$ and $Y$ are independent random variables.
I have read the proof of the theorem and more or less understood it, but the statement itself seems odd: $\mathbb{E}[Y|X]$ is a random variable, while $\mathbb{E}[Y]$ is a number.
So the law says that a random variable is equal to a number. How does that work out?
When $X$ and $Y$ are independent, it is indeed true that $E[Y|X]=E[Y]$, but the equality holds only almost surely (a.s.), i.e., the set where it fails has probability $0$.
Here is the argument:
For simplicity, assume $X$ is real valued. Suppose $A\in \sigma(X)$; then $A=X^{-1}(B)$ for some Borel set $B$, and $$\begin{align} E[Y \mathbb{1}_A]&=E[Y\mathbb{1}_B(X)]=E[Y]\,E[\mathbb{1}_B(X)]\\ &=E[Y]\,\mathbb{P}(X\in B)=E[Y]\,\mathbb{P}(A), \end{align}$$ where the second identity in the first line is due to the independence assumption.
Since the constant map $\omega\mapsto E[Y]$ is $\sigma(X)$-measurable, it follows from the definition (and a.s. uniqueness) of conditional expectation that $E[Y|X]=E[Y]$ $\mathbb{P}$-a.s.
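As a quick numerical sanity check (a sketch, not part of the proof): if $X$ and $Y$ are independent, the empirical mean of $Y$ restricted to each value of $X$ should be close to the overall mean $E[Y]$. Here $X$ is taken to be a fair coin and $Y$ an independent fair die, so $E[Y]=3.5$; these particular distributions are just illustrative choices.

```python
import random

random.seed(0)
n = 200_000

# X: fair coin in {0, 1}; Y: fair die in {1, ..., 6}, drawn independently of X
xs = [random.randint(0, 1) for _ in range(n)]
ys = [random.randint(1, 6) for _ in range(n)]

overall = sum(ys) / n  # empirical E[Y], should be close to 3.5

# Conditional means: average of Y over the samples where X = x
for x in (0, 1):
    group = [y for x_i, y in zip(xs, ys) if x_i == x]
    cond_mean = sum(group) / len(group)
    print(f"E[Y | X={x}] ~ {cond_mean:.3f}   vs   E[Y] ~ {overall:.3f}")
```

Both conditional means come out close to $3.5$, matching the claim that the random variable $E[Y|X]$ is a.s. equal to the constant $E[Y]$ when $X$ and $Y$ are independent.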