$X, Y, Z$ square-integrable rvs, $Z$ is $\sigma(Y)$-measurable and $X$ is not $\sigma(Y)$-measurable. Prove or disprove the equivalence of two statements


Let $X, Y, Z$ be square-integrable random variables. Assume also that $Z$ is $\sigma(Y)$-measurable and $X$ is not $\sigma(Y)$-measurable.

Are the following statements equivalent?

  1. $X-Z$ is uncorrelated with every square-integrable $\sigma(Y)$-measurable random variable $W$
  2. $X-Z$ is independent of every square-integrable $\sigma(Y)$-measurable random variable $W$

I tried to solve the problem via conditional expectation, using the claim that if $\mathbb E(X \mid Y)$ is constant then $X$ and $Y$ are independent, together with its converse. But it seems this does not hold in general, and I am at a dead end.
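Indeed, the converse direction is the valid one: independence forces $\mathbb E(X \mid Y) = \mathbb E X$, but a constant conditional expectation does not force independence. A minimal sketch (my own toy construction, not from the thread): with $Y \in \{1,2\}$ and $\xi = \pm 1$ independent of $Y$, the variable $X = \xi Y$ has $\mathbb E(X \mid Y) = 0$, yet $|X| = Y$, so $X$ and $Y$ are dependent.

```python
import random

random.seed(2)

n = 100_000
samples = []
for _ in range(n):
    y = random.choice([1, 2])      # Y in {1, 2}, equally likely
    xi = random.choice([-1, 1])    # xi = +/-1, independent of Y
    samples.append((xi * y, y))    # X = xi * Y

# E(X | Y = y) = y * E(xi) = 0 for both values of y, so E(X | Y) is constant...
mean_given_1 = sum(x for x, y in samples if y == 1) / sum(1 for _, y in samples if y == 1)
mean_given_2 = sum(x for x, y in samples if y == 2) / sum(1 for _, y in samples if y == 2)

# ...but X and Y are dependent: |X| determines Y exactly.
determined = all(abs(x) == y for x, y in samples)

print(mean_given_1, mean_given_2, determined)
```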




The two conditions are not equivalent. The idea of the example below is taken from Dilip Sarwate's comment on this question.

Let $(X,Y)$ be uniform on the ball $B(0,1)$ in $\mathbb{R}^{2}$. That is, if $h$ is any bounded, continuous function in $\mathbb{R}^{2}$, then \begin{equation*} \mathbb{E}(h(X,Y)) = \omega_{2}^{-1} \int_{B(0,1)} h(x,y) \, dx \, dy, \end{equation*} where $\omega_{2}$ is the area of $B(0,1)$. Notice that $X$ and $Y$ are not independent.

Let $Z = 0$. Observe that (2) fails since $X - Z = X$ is not independent of $Y$, even though $Y$ itself is a square-integrable $\sigma(Y)$-measurable random variable.

Recall that $W$ is $\sigma(Y)$-measurable if and only if there is a Borel function $k : \mathbb{R} \to \mathbb{R}$ such that $W = k(Y)$. It follows that (1) holds. Indeed, given such a square-integrable $W$, we can write \begin{equation*} \mathbb{E}((X - Z)W) = \mathbb{E}(XW) = \omega_{2}^{-1} \int_{B(0,1)} x k(y) \, dx \, dy = \omega_{2}^{-1} \int_{-1}^{1} \left[ \int_{-\sqrt{1 - y^{2}}}^{\sqrt{1 - y^{2}}} x \, dx \right] k(y) \, dy = 0, \end{equation*} since the inner integral vanishes for every $y$.

This gives an example where (1) holds, but (2) does not.
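A quick Monte Carlo check of this example (a sketch only; the rejection sampler and the particular test functions $k(y) = y$ and $k(y) = y^2$ are my illustrative choices):

```python
import random

random.seed(0)

# Sample (X, Y) uniformly from the unit disk B(0, 1) by rejection sampling.
def sample_disk():
    while True:
        x, y = random.uniform(-1.0, 1.0), random.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:
            return x, y

n = 200_000
pairs = [sample_disk() for _ in range(n)]

# Statement (1): E[X * k(Y)] = 0 for every square-integrable k(Y).
# Spot-check two choices of k: k(y) = y and k(y) = y**2.
cov_y = sum(x * y for x, y in pairs) / n
cov_y2 = sum(x * y * y for x, y in pairs) / n

# Statement (2) fails: X and Y are dependent. Given Y = y, X is uniform on
# [-sqrt(1-y^2), sqrt(1-y^2)], so the spread of X shrinks as |Y| grows.
far = [x for x, y in pairs if abs(y) > 0.9]
near = [x for x, y in pairs if abs(y) < 0.1]
var_far = sum(x * x for x in far) / len(far)     # roughly (1 - y^2)/3, small
var_near = sum(x * x for x in near) / len(near)  # roughly 1/3

print(cov_y, cov_y2, var_far, var_near)
```

Both empirical correlations come out near zero, while the conditional spread of $X$ differs sharply between $|Y| > 0.9$ and $|Y| < 0.1$, confirming the dependence.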


This example is another form of the idea in Peter Morfe's answer.

Let $Y = \pm 1$ with probability $\frac12$ each, let $\xi$ be independent of $Y$ with the same distribution, and set $X = \xi (Y+1)$, $Z=0$, $W_0 = Y$.

It is easy to see that $X-Z = X$ and $W_0 = Y$ are dependent: if $Y=-1$ then $X = 0$ a.s., while if $Y=1$ then $X \ne 0$ a.s. Hence the second statement is false.

But for any square-integrable $\sigma(Y)$-measurable $W$, $$E(X-Z)W = EXW = E\,\xi (Y+1) W = E\big(E(\xi (Y+1) W \mid Y)\big) = E\big((Y+1) W \, E(\xi \mid Y)\big) = E\big((Y+1)W\big)\, E\xi = 0,$$ since $(Y+1)W$ is $\sigma(Y)$-measurable, $\xi$ is independent of $Y$, and $E\xi = 0$. Hence the first statement is true.
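A simulation of this discrete example (a sketch; the test functions $k(y) = y$ and $k(y) = e^{y}$ are my illustrative choices of $W = k(Y)$):

```python
import math
import random

random.seed(1)

n = 100_000
samples = []
for _ in range(n):
    y = random.choice([-1, 1])         # Y = +/-1 with probability 1/2
    xi = random.choice([-1, 1])        # xi independent of Y, same distribution
    samples.append((xi * (y + 1), y))  # X = xi*(Y+1), Z = 0

# Statement (1): E[(X - Z) * W] = 0 for W = k(Y); try k(y) = y and k(y) = e^y.
m_y = sum(x * y for x, y in samples) / n
m_exp = sum(x * math.exp(y) for x, y in samples) / n

# Statement (2) fails: X = 0 exactly when Y = -1, so X and W0 = Y are dependent.
p0_neg = sum(1 for x, y in samples if y == -1 and x == 0) / sum(1 for _, y in samples if y == -1)
p0_pos = sum(1 for x, y in samples if y == 1 and x == 0) / sum(1 for _, y in samples if y == 1)

print(m_y, m_exp, p0_neg, p0_pos)
```

The empirical means vanish up to sampling error, while $P(X = 0 \mid Y = -1) = 1$ and $P(X = 0 \mid Y = 1) = 0$ exhibit the dependence directly.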

It follows that conditions 1 and 2 are not equivalent.