Let $X$ be a random variable with conditional expectation $\mathbb{E}[X\mid \Sigma]=0$ for some sub-$\sigma$-algebra $\Sigma$. Is $X$ independent of $\Sigma$?


To be more precise, fix a probability space $(\Omega,\mathcal{F},\mathbb{P})$, and let $X:\Omega\to\mathbb{R}$ be an $L^1$ random variable. Suppose $\Sigma\subset\mathcal{F}$ is a sub-$\sigma$-algebra such that $\mathbb{E}[X\mid\Sigma]=0$. Is $X$ then independent of $\Sigma$?

This is true in the trivial cases $\Sigma=\{\emptyset,\Omega\}$ and $\Sigma=\mathcal{F}$ (in the latter case $X=0$ a.s., and a constant is independent of everything).

Accepted answer:

Let $X\sim N(0,1)$ and let $Y$ be any non-degenerate random variable with $Y\ge 1$ a.s., independent of $X$. Since $Y$ is $\sigma(Y)$-measurable and $X$ is independent of $\sigma(Y)$, $$ \mathsf{E}[(X/Y)\mid \sigma(Y)]=Y^{-1}\,\mathsf{E}[X\mid\sigma(Y)]=0=\mathsf{E}[X/Y] \quad\text{a.s.,} $$ but $X/Y$ is not independent of $\sigma(Y)$.
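A quick simulation sketch of this counterexample (not part of the original answer; the choice of $Y$ uniform on $\{1,2\}$ is my own concrete instance): the conditional means of $X/Y$ given $Y$ are both near $0$, yet the conditional distribution of $X/Y$ clearly changes with $Y$, so the two are dependent.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
X = rng.standard_normal(n)          # X ~ N(0, 1)
Y = rng.choice([1, 2], size=n)      # non-degenerate, Y >= 1 a.s., independent of X
Z = X / Y

# E[Z | sigma(Y)] = 0 a.s., so both conditional sample means are near 0.
mean_given_1 = Z[Y == 1].mean()
mean_given_2 = Z[Y == 2].mean()

# But Z is not independent of Y: its conditional spread depends on Y.
p_given_1 = (np.abs(Z) > 1)[Y == 1].mean()  # estimates P(|X| > 1) ~ 0.32
p_given_2 = (np.abs(Z) > 1)[Y == 2].mean()  # estimates P(|X| > 2) ~ 0.05
```

The tail probabilities given $Y=1$ and $Y=2$ differ by an order of magnitude, which is exactly the dependence the answer asserts.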

Another answer:

Consider the example I gave here:

Flip a fair coin to determine the amount of your bet: if heads, you bet \$1, if tails you bet \$2. Then flip again: if heads, you win the amount of your bet, if tails, you lose it. (For example, if you flip heads and then tails, you lose \$1; if you flip tails and then heads you win \$2.) Let $X$ be the amount you bet, and let $Y$ be your net winnings (negative if you lost).

Then $X,Y$ are not independent; for instance we have $P(X=2) = 1/2$, $P(Y=1)=1/4$, and $P(X=2, Y=1) = 0$. But taking $\Sigma = \sigma(X)$ we have $E[Y \mid \Sigma] = E[Y \mid X] = 0$. You can see that $P(Y=1 \mid X=1) = P(Y=-1 \mid X=1) = 1/2$, so $$E[Y \mid X=1] = 1 \cdot \frac{1}{2} + (-1) \cdot \frac{1}{2} = 0.$$ Likewise $E[Y \mid X=2] = 0$. The idea is that the value of $X$ does affect the possible values for $Y$, but no matter how the first flip came up, you are still making a (conditionally) fair bet.
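The betting example can be sketched as a simulation (my own illustration, not part of the answer): encode the first flip as the bet size $X\in\{1,2\}$ and the second as a sign, so $Y=\pm X$. The conditional means $E[Y\mid X=x]$ come out near $0$, while the joint probability $P(X=2,\,Y=1)$ is exactly $0$, far from the product $P(X=2)P(Y=1)=1/8$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
X = rng.choice([1, 2], size=n)      # first fair flip: bet $1 or $2
sign = rng.choice([-1, 1], size=n)  # second fair flip: lose or win
Y = sign * X                        # net winnings, Y = +/- X

# E[Y | X = x] = 0 for each bet size: the bet is conditionally fair.
mean_given_1 = Y[X == 1].mean()
mean_given_2 = Y[X == 2].mean()

# Yet X and Y are dependent: the event {X=2, Y=1} is impossible,
# while P(X=2) * P(Y=1) = (1/2) * (1/4) = 1/8.
joint = ((X == 2) & (Y == 1)).mean()
product = (X == 2).mean() * (Y == 1).mean()
```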

This is similar in spirit to the example given by d.k.o. but perhaps a little more elementary.