Is stating $\operatorname{E}(x) = 0, \operatorname{cov}(y,x)=0$, the same as stating $\operatorname E(y\,|\,x)=0$?


$\newcommand{\mo}[1]{\operatorname{#1}}$ I am trying to get a better understanding of the inner workings of the expectation operator. Suppose we have two random variables $y$ and $x$.

How to prove that:

The condition $$\mo E(y\,|\,x)=0$$

is the same as the two conditions combined: $$\mo{cov}(y,x)=0 \,\land\, \bigl(\mo E(y)=0 \,\lor\, \mo E(x)=0\bigr)$$

What is the best way to also prove that both of these conditions reduce to $\mo E(u)=0$ when $x$ is deterministic?

There are 3 answers below.

BEST ANSWER

To complement Robert Israel's answer.

The condition $E[Y\mid X]=0$ is much more informative than the set {$\operatorname{Cov}(X,Y)=0$, $E[X]=0$, $E[Y]=0$}. The latter is just three numbers; the former is (in general) a function: the regression curve.

{$\operatorname{Cov}(X,Y)=0$, $E[X]=0$, $E[Y]=0$} only tells you that the variables are neither positively nor negatively correlated. But you can think of many examples in which this holds and yet the regression curve is not the trivial horizontal line. For example, take a uniform distribution over some 2D shape that is symmetric about the $Y$ axis, say an isosceles triangle with a horizontal base, and shift it up or down until both means are zero (draw it!).
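That triangle example is easy to check numerically. Below is a minimal sketch (my own choice of concrete shape, not part of the answer): sample uniformly from the triangle with vertices $(-1,0)$, $(1,0)$, $(0,1)$, shifted down by its centroid height $1/3$ so that both means are zero. The covariance vanishes by symmetry, yet the conditional mean of $Y$ clearly varies with $X$:

```python
import random

random.seed(0)
N = 200_000

xs, ys = [], []
while len(xs) < N:
    # Rejection sampling: uniform over the triangle (-1,0), (1,0), (0,1)
    x = random.uniform(-1, 1)
    y = random.uniform(0, 1)
    if y < 1 - abs(x):
        xs.append(x)
        ys.append(y - 1/3)   # shift down so E[Y] = 0 (centroid height is 1/3)

mean_x = sum(xs) / N
mean_y = sum(ys) / N
cov = sum(a * b for a, b in zip(xs, ys)) / N - mean_x * mean_y
print(mean_x, mean_y, cov)   # all close to 0

# Conditional mean of Y near the center vs near the edges: not constant
mid  = [y for x, y in zip(xs, ys) if abs(x) < 0.1]
edge = [y for x, y in zip(xs, ys) if abs(x) > 0.8]
print(sum(mid) / len(mid), sum(edge) / len(edge))
```

The conditional means differ markedly (analytically, $E[Y \mid X=x] = (1-|x|)/2 - 1/3$, which is about $1/6$ at $x=0$ and $-1/3$ at the corners), even though all three summary numbers are zero.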

In words, {$\operatorname{Cov}(X,Y)=0$} tells you roughly this: knowing that $X$ is bigger than expected does not tell me whether $Y$ tends to be bigger or smaller than expected. On the other hand, $E(Y \mid X)=E(Y)$ tells you this: knowing the value of $X$ does not alter the expected value of $Y$, for any value of $X$. The latter is much more informative.

What is true is this: if we know/assume that the regression curve is a straight line, $E[Y \mid X]=a X +b$, then the data {$\operatorname{Cov}(X,Y)=0$, $E[X]=0$, $E[Y]=0$} implies the constants $a,b$ are zero, and then, yes, $E[Y \mid X]=0$.
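For the record, that reduction can be spelled out in two lines (my own short derivation, using only the stated assumptions together with $\operatorname{Var}(X)>0$):

```latex
\begin{align*}
0 = E[Y] &= E\bigl[E[Y \mid X]\bigr] = aE[X] + b = b
  && \text{(since } E[X]=0\text{)},\\
0 = \operatorname{Cov}(X,Y) &= E[XY] - E[X]E[Y]
  = E\bigl[X\,E[Y \mid X]\bigr] = aE[X^2] + bE[X] = a\operatorname{Var}(X),
\end{align*}
```

so $a=0$ provided $\operatorname{Var}(X)>0$, and therefore $E[Y \mid X] = aX + b = 0$.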

ANSWER

Let $y$ and $x$ be independent with means $E[y]=a,E[x]=0$, then $E[y|x]=E[y]=a \neq 0$, yet by independence $cov(y,x)=0$. By switching the expected values of $x,y$ we can get a counterexample to the other direction as well.
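This counterexample is easy to verify by simulation (a minimal sketch; the concrete distributions are my choice, not the answer's): take $x$ uniform on $(-1,1)$, so $E[x]=0$, and $y$ an independent Gaussian with mean $a=2$:

```python
import random

random.seed(1)
N = 100_000
x = [random.uniform(-1, 1) for _ in range(N)]    # E[x] = 0
y = [2 + random.gauss(0, 1) for _ in range(N)]   # independent of x, E[y] = 2

mx = sum(x) / N
my = sum(y) / N
cov = sum(a * b for a, b in zip(x, y)) / N - mx * my
print(mx, my, cov)   # mx ≈ 0, my ≈ 2, cov ≈ 0
```

By independence $E[y \mid x] = E[y] = 2 \neq 0$, even though the covariance and $E[x]$ are both zero.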

Also, $x$ being deterministic has no effect on this argument, since the covariance of any rv with a constant is $0$: $E[Xc]-E[X]E[c] = 0$.

ANSWER

If $E(y|x) = 0$, then $E[y] = E[E[y|x]] = 0$ and $E[xy] = E[xE[y|x]] = 0$, so $\text{cov}(x,y) = 0$.

Conversely if $\text{cov}(x,y) = 0$ and $E[x]=0$ or $E[y]=0$ then $E[xy] = 0$. But that does not imply $E[y|x] = 0$. For example, consider a case where $y = 1$ a.s. while $E(x) = 0$. We then have $E[y|x] = 1$.
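Both directions can be checked numerically (a hedged sketch with distributions of my own choosing): for the forward direction, take $y = x\varepsilon$ where $\varepsilon = \pm 1$ is a fair coin independent of $x$, so that $E[y \mid x] = x\,E[\varepsilon] = 0$:

```python
import random

random.seed(2)
N = 100_000
x   = [random.gauss(0, 1) for _ in range(N)]
eps = [random.choice([-1, 1]) for _ in range(N)]   # independent of x, mean 0
y   = [a * b for a, b in zip(x, eps)]              # E[y | x] = x * E[eps] = 0

mx  = sum(x) / N
my  = sum(y) / N
cov = sum(a * b for a, b in zip(x, y)) / N - mx * my
print(my, cov)   # both ≈ 0, as the forward direction predicts
```

Conversely, with $y \equiv 1$ and $E[x]=0$ we get $E[xy]=E[x]=0$ and $\text{cov}(x,y)=0$, yet $E[y \mid x]=1$, matching the counterexample above.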