Example $E[X|Y] = E[X]$ for $X, Y$ dependent


So I need a counterexample to show that $E[X|Y] = E[X]$ does not necessarily imply independence. My thought:

If $X\sim\mathcal U(-1,1)$ and $Y=-X$, then $E[X|Y] = \int_\Omega X(\omega)P^{X|Y}(d\omega)$, but $P^{X|Y}:= \frac{P(X\cap Y)}{P(Y)}$ and, as $X$ and $Y$ never coexist, $P(X\cap Y)=0$, giving the desired result $E[X|Y] = E[X]=0$.

Is this correct? I saw an example elsewhere (after I formulated this one) that uses $Y=X^2$. But if both these work, wouldn't this then be true for any $Y=f(X)$ that is disjoint from $X$?

4 Answers

BEST ANSWER

Your example does not work: $E[X\mid -X]$ is $X$ itself, since $X$ is already measurable w.r.t. $\sigma(-X)=\sigma(X)$. So here $E[X\mid Y]=X$, which is not the constant $E[X]$.
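This point can be checked numerically by binning on $Y$; a small sketch (the bin width, sample size, and seed are arbitrary choices of mine):

```python
import random

random.seed(0)

# The proposed counterexample: X ~ Uniform(-1, 1) and Y = -X.
xs = [random.uniform(-1.0, 1.0) for _ in range(100_000)]
ys = [-x for x in xs]

# Estimate E[X | Y near y0] by averaging X over a narrow band of Y values.
# If E[X | Y] really equaled E[X] = 0, every band average would be near 0.
for y0 in (-0.5, 0.0, 0.5):
    band = [x for x, y in zip(xs, ys) if abs(y - y0) < 0.05]
    print(y0, sum(band) / len(band))
# Instead the band averages track -y0, i.e. E[X | Y] = -Y = X, not a constant.
```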

ANSWER

Take $Y$ to be some positive random variable, $U$ a uniform random variable on $[0,1]$, and $Z$ a Rademacher random variable with parameter $\frac{1}{2}$ (i.e. taking values $\pm 1$ with probability $\frac12$ each). Assume that these three random variables are independent. It is known and easy to show that $$-\frac{\log(1-U)}{\lambda}$$ for a positive real number $\lambda$ is an exponential random variable with parameter $\lambda$.

Now, set $$X=Z\frac{\log(1-U)}{Y}$$. Thus, $$\mathbb{E}\left[X\mid Y\right]=\mathbb{E}\left[Z\frac{\log(1-U)}{Y}\mid Y\right]=\frac{1}{Y}\mathbb{E}\left[Z\log(1-U)\mid Y\right].$$ But as $Z$ and $U$ are independent of $Y$, we have $$\mathbb{E}\left[Z\log(1-U)\mid Y\right]=\mathbb{E}\left[Z\log(1-U)\right]=0.$$ In addition, $\mathbb{E}[X]=\mathbb{E}\left\{\mathbb{E}\left[X\mid Y\right]\right\}=0$.

Of course $X$ and $Y$ are obviously dependent.

My construction might seem a bit elaborate, but I wanted it explicit. The underlying idea is very simple. Consider a random variable $X$ whose distribution depends on some parameters, say $a$ and $b$, and assume that the expectation of $X$ does not depend on $a$. Then, if the parameter $a$ is in fact random, the conditional expectation of $X$ given $a$ does not depend on $a$.

In the example above, I chose $X$ to have a Laplace distribution (conditionally on $Y$) with random variance (which depends on $Y$), but the expectation of such a distribution does not depend on the variance.
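A Monte Carlo sketch of this construction (the specific choice $Y\sim U([1,2])$ is mine, not the answer's; any positive $Y$ works):

```python
import math
import random

random.seed(1)

n = 200_000
samples = []
for _ in range(n):
    y = random.uniform(1.0, 2.0)   # some positive Y; Uniform(1, 2) is an arbitrary choice
    u = random.random()            # U ~ Uniform(0, 1)
    z = random.choice((-1, 1))     # Z Rademacher with parameter 1/2
    samples.append((z * math.log(1.0 - u) / y, y))   # X = Z log(1-U) / Y

mean_x = sum(x for x, _ in samples) / n
low = [x for x, y in samples if y < 1.5]     # samples with Y < 1.5
high = [x for x, y in samples if y >= 1.5]   # samples with Y >= 1.5

# E[X] and both conditional means are all near 0 ...
print(mean_x, sum(low) / len(low), sum(high) / len(high))

# ... but X and Y are dependent: given Y, |X| is exponential with rate Y,
# so E[|X| given Y] = 1/Y, and the conditional spread shrinks as Y grows.
print(sum(abs(x) for x in low) / len(low), sum(abs(x) for x in high) / len(high))
```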

0
On

I don't think your example works: if you know the value of $Y$, then you know exactly which value $X$ takes.

On the other hand, the squared example ($Y=X^2$) works because, given $Y$, there are two possible values of $X$, one the opposite of the other, with the same probability.

Edit

In either case, be careful: these random vectors $(X,Y)$ are not "continuous" (their laws are not given by the usual integral of a joint density function).

You can construct an example simpler than the one presented by @hamms.

Consider $Y\sim U([0,1])$, a uniform variable, and let $X|Y=y_0 \sim U([-y_0,y_0])$. Obviously $E[X|Y=y_0] = 0$ for every $y_0$, so $E[X|Y] = 0$.

The joint distribution is zero everywhere except on the triangle bounded by the lines $y = 1$, $x = y$ and $x=-y$, where it is $f_{XY}(x,y) = \frac1{2y}$. So just compute the marginal density by integrating accordingly: $$ f_X(x) = \int f_{XY}(x,y)\,\mathrm{d}y = \int_{|x|}^1 \frac1{2y}\,\mathrm{d}y = -\frac12 \ln |x|, \qquad x\in[-1,1]. $$ Now compute the expectation of $X$. By symmetry it is $0$. We can be more precise (and boring!): $$ \begin{align*} E[X] &= \int_{-1}^1 x f_X(x) \,\mathrm{d}x = \int_0^1 x f_X(x) \,\mathrm{d}x + \int_{-1}^0 x f_X(x) \,\mathrm{d}x \\ &= \int_0^1 x f_X(x) \,\mathrm{d}x - \int_{-1}^0 |x| f_X(x) \,\mathrm{d}x = \int_0^1 x f_X(x) \,\mathrm{d}x - \int_0^1 x f_X(x) \,\mathrm{d}x \\ &=0, \end{align*} $$ since $f_X(x) = f_X(-x)$.
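This example is easy to check by simulation; a short sketch (sample size, band edges, and seed are arbitrary choices):

```python
import random

random.seed(2)

n = 200_000
pairs = []
for _ in range(n):
    y = random.random()            # Y ~ Uniform(0, 1)
    x = random.uniform(-y, y)      # X | Y = y ~ Uniform(-y, y)
    pairs.append((x, y))

print(sum(x for x, _ in pairs) / n)    # E[X] is near 0

# The conditional mean of X is near 0 in every band of Y, yet X and Y are
# clearly dependent: |X| <= Y always, so the conditional range depends on Y.
for a, b in ((0.0, 0.25), (0.25, 0.5), (0.5, 1.0)):
    band = [x for x, y in pairs if a <= y < b]
    print(a, b, sum(band) / len(band))
```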

ANSWER

Consider the following contingency table:

[Contingency table: $X\in\{-1,0,1\}$ against $Y\in\{0,1\}$, with $P(X=0\mid Y=0)=1$, $P(X=-1\mid Y=1)=P(X=1\mid Y=1)=0.5$, and zero mass in the cells $(X=\pm 1,\,Y=0)$.]

As you can verify

$$\mathbb{E}[X]=0$$

and likewise

$$\mathbb{E}[X|Y]=0,$$

since

$$\mathbb{E}[X|Y=0]=0\times1=0$$

$$\mathbb{E}[X|Y=1]=-1\times 0.5+1\times0.5=0.$$

But clearly $X$ and $Y$ are not independent (there are null cells in the contingency table).

In this situation, $X$ is said to be regression independent of $Y$ (also called mean independent), because the regression function $E(X|Y)$ is constant (and therefore equal to $E(X)$). As the counterexample shows, this does not imply that $X\perp\!\!\!\perp Y$.
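The discrete example can be verified exactly in a few lines. The $Y$ marginal is not pinned down above, so $P(Y=0)=P(Y=1)=\tfrac12$ below is an assumed choice (it does not affect the conditional means):

```python
# One joint distribution consistent with the contingency-table answer;
# p[(x, y)] = P(X = x, Y = y). The Y marginal of 1/2 each is an assumption.
p = {(0, 0): 0.5, (-1, 1): 0.25, (1, 1): 0.25}

# Marginal of Y, overall mean of X, and conditional means of X given Y.
p_y = {y: sum(q for (_, y2), q in p.items() if y2 == y) for y in (0, 1)}
e_x = sum(x * q for (x, _), q in p.items())
e_x_given = {y: sum(x * q for (x, y2), q in p.items() if y2 == y) / p_y[y]
             for y in (0, 1)}
print(e_x, e_x_given)   # E[X] = 0 and both conditional means are 0

# Yet X and Y are not independent: P(X=0, Y=1) = 0 != P(X=0) * P(Y=1).
p_x0 = sum(q for (x, _), q in p.items() if x == 0)
print(p.get((0, 1), 0.0), p_x0 * p_y[1])
```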