Validity of the law $\mathbb{E}[Y|X]=\mathbb{E}[Y]$ where $X$ and $Y$ are independent random variables


I am reading about the Law of Total Expectation, and came across the following law: $\mathbb{E}[Y|X]=\mathbb{E}[Y]$ where $X$ and $Y$ are independent random variables.

Now I have read the proof of the theorem and more or less understood it. But the theorem itself seems odd, because $\mathbb{E}[Y|X]$ is a random variable while $\mathbb{E}[Y]$ is a number.

So the law says that a random variable is equal to a number. How does that work out?


BEST ANSWER

When $X$ and $Y$ are independent, it is true that $E[Y|X]=E[Y]$, although the equality holds only almost surely (a.s.), i.e., the set where it fails has probability $0$.

Here is the argument:

For simplicity assume $X$ is real valued. Suppose $A\in \sigma(X)$; then $A=X^{-1}(B)$ for some Borel set $B$, so $\mathbb{1}_A=\mathbb{1}_B(X)$. Then $$\begin{align} E[Y \mathbb{1}_A]&=E[Y\mathbb{1}_B(X)]=E[Y]\,E[\mathbb{1}_B(X)]\\ &=E[Y]\,\mathbb{P}(X\in B)=E[Y]\,\mathbb{P}(A), \end{align}$$ where the second identity in the first row is due to the independence assumption.
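The key identity $E[Y\mathbb{1}_A]=E[Y]\,\mathbb{P}(A)$ can be sanity-checked numerically. The following is a sketch, not a proof: the distributions ($X$ uniform on $[0,1]$, $Y$ normal with mean $2$), the Borel set $B=[0,0.3)$, and the seed are all arbitrary choices for illustration.

```python
import random

random.seed(0)
N = 200_000

# Independent draws: X uniform on [0, 1], Y normal with mean 2 (E[Y] = 2).
xs = [random.random() for _ in range(N)]
ys = [random.gauss(2.0, 1.0) for _ in range(N)]

# Take B = [0, 0.3), so A = {X in B}. Independence predicts
# E[Y * 1_A] = E[Y] * P(A), here roughly 2 * 0.3 = 0.6.
lhs = sum(y for x, y in zip(xs, ys) if x < 0.3) / N          # E[Y * 1_A]
rhs = (sum(ys) / N) * (sum(1 for x in xs if x < 0.3) / N)    # E[Y] * P(A)
print(lhs, rhs)
```

Both estimates should agree up to Monte Carlo error; making $Y$ depend on $X$ would break the factorization.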

Since the constant map $\omega\mapsto E[Y]$ is $\sigma(X)$-measurable, it follows from the definition (and a.s.-uniqueness) of conditional expectation that $E[Y|X]=E[Y]$ $\mathbb{P}$-a.s.
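The conclusion $E[Y|X]=E[Y]$ a.s. can also be seen empirically: for independent $X$ and $Y$, the average of $Y$ over any event about $X$ is close to the overall mean of $Y$. A minimal simulation sketch (distributions, cut points, and seed are assumptions chosen for illustration):

```python
import random

random.seed(1)
N = 200_000

# Independent draws: X standard normal, Y uniform on [0, 1], so E[Y] = 0.5.
xs = [random.gauss(0.0, 1.0) for _ in range(N)]
ys = [random.random() for _ in range(N)]

def cond_mean(pred):
    """Estimate E[Y | X satisfies pred] by averaging Y over that event."""
    vals = [y for x, y in zip(xs, ys) if pred(x)]
    return sum(vals) / len(vals)

# Conditioning on very different events about X barely moves the mean of Y.
m_low = cond_mean(lambda x: x < -0.5)
m_mid = cond_mean(lambda x: -0.5 <= x < 0.5)
m_high = cond_mean(lambda x: x >= 0.5)
print(m_low, m_mid, m_high)
```

All three conditional averages hover around $E[Y]=0.5$, which is exactly the statement that the random variable $E[Y|X]$ is (a.s.) the constant $E[Y]$.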


The idea of '$\mathbb{E}[X]$' being a constant is simple.

Alternatively one could (which I think is an excellent way to think about random variables) define $\mathbb{E}[X] := \mathbb{E}[X \mid \{ \emptyset,\Omega \}]$, i.e., the conditional expectation with respect to the trivial sigma-algebra.

Notice the RHS is $\{\emptyset , \Omega\}$-measurable and hence constant, and likewise the LHS is a constant.

Think of $\mathbb{E}[X \mid \mathcal{A}]$ as "the best guess of $X$ given that we understand $\mathcal{A}$". Intuitively, if $X$ and $Y$ are independent, then learning $Y$ doesn't change your best guess of $X$ at all: you are stuck where you were before, guessing with only the trivial sigma-algebra.
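For contrast, the "best guess" really does move when the variables are dependent. A quick numerical sketch (the model $Y = X + \text{noise}$, the bin edges, and the seed are assumptions for illustration, not part of the answer):

```python
import random

random.seed(2)
N = 200_000

# Dependent case: Y = X + small noise, so learning about X
# genuinely shifts the best guess of Y.
xs = [random.random() for _ in range(N)]
ys = [x + random.gauss(0.0, 0.1) for x in xs]

def cond_mean(lo, hi):
    """Estimate E[Y | lo <= X < hi] by averaging Y over that event."""
    vals = [y for x, y in zip(xs, ys) if lo <= x < hi]
    return sum(vals) / len(vals)

low = cond_mean(0.0, 0.2)   # roughly 0.1
high = cond_mean(0.8, 1.0)  # roughly 0.9
print(low, high)
```

Here the conditional mean of $Y$ tracks $X$, so $E[Y|X]$ is a genuinely non-constant random variable; independence is exactly what collapses it to the constant $E[Y]$.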