Let $X, Y, Z$ be square-integrable random variables. Suppose further that $Z$ is $\sigma(Y)$-measurable and $X$ is not $\sigma(Y)$-measurable.
Are the following statements equivalent?
- $X-Z$ is uncorrelated with every square-integrable $\sigma(Y)$-measurable random variable $W$;
- $X-Z$ is independent of every square-integrable $\sigma(Y)$-measurable random variable $W$.
I tried to approach the problem via conditional expectation, hoping to use the claim that if $\mathbb{E}(X \mid Y)$ is constant then $X$ and $Y$ are independent (the converse certainly holds: independence gives $\mathbb{E}(X \mid Y) = \mathbb{E}(X)$). But that claim fails in general, and I am at a dead end.
The two conditions are not equivalent. The idea of the example below is taken from Dilip Sarwate's comment on this question.
Let $(X,Y)$ be uniform on the unit disk $B(0,1)$ in $\mathbb{R}^{2}$. That is, if $h$ is any bounded, continuous function on $\mathbb{R}^{2}$, then \begin{equation*} \mathbb{E}(h(X,Y)) = \omega_{2}^{-1} \int_{B(0,1)} h(x,y) \, dx \, dy, \end{equation*} where $\omega_{2} = \pi$ is the area of $B(0,1)$. Notice that $X$ and $Y$ are not independent, since the support of $(X,Y)$ is not a product set.
Let $Z = 0$. Then (2) fails: $X - Z = X$ is not independent of $Y$, and $Y$ itself is a square-integrable $\sigma(Y)$-measurable random variable.
Recall that $W$ is $\sigma(Y)$-measurable if and only if there is a Borel function $k : \mathbb{R} \to \mathbb{R}$ such that $W = k(Y)$. It follows that (1) holds. Indeed, $\mathbb{E}(X) = 0$ by symmetry, so uncorrelatedness reduces to $\mathbb{E}((X-Z)W) = 0$, and for any square-integrable $W = k(Y)$ we can write \begin{equation*} \mathbb{E}((X - Z)W) = \mathbb{E}(XW) = \omega_{2}^{-1} \int_{B(0,1)} x k(y) \, dx \, dy = \omega_{2}^{-1} \int_{-1}^{1} \left[ \int_{-\sqrt{1 - y^{2}}}^{\sqrt{1 - y^{2}}} x \, dx \right] k(y) \, dy = 0, \end{equation*} since the inner integral vanishes by symmetry for each $y$.
This gives an example where (1) holds, but (2) does not.
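As a quick numerical sanity check (not part of the proof), one can approximate the relevant expectations by Monte Carlo. The sketch below, in Python with NumPy, samples uniformly from the unit disk, verifies that $\operatorname{cov}(X, k(Y)) \approx 0$ for a few test functions $k$, and shows that $\operatorname{cov}(X^{2}, Y^{2})$ is visibly negative (the exact value is $1/24 - 1/16 = -1/48$), witnessing the dependence of $X$ on $Y$. The choice of test functions and the rejection-sampling scheme are my own illustration, not taken from the answer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample uniformly from the unit disk via rejection from the square [-1, 1]^2.
# The acceptance rate is pi/4, so 2n proposals comfortably yield n points.
n = 1_000_000
pts = rng.uniform(-1.0, 1.0, size=(2 * n, 2))
pts = pts[np.sum(pts**2, axis=1) <= 1.0][:n]
X, Y = pts[:, 0], pts[:, 1]

# Condition (1): with Z = 0, X - Z = X should be uncorrelated
# with k(Y) for every square-integrable Borel k. Try a few test functions.
for k in (np.sin, np.square, np.abs):
    print(f"cov(X, {k.__name__}(Y)) ~ {np.cov(X, k(Y))[0, 1]:+.4f}")

# Condition (2) fails: X and Y are dependent, e.g. X^2 and Y^2 correlate
# (exact covariance is -1/48 ~ -0.0208 for the uniform disk).
print(f"cov(X^2, Y^2) ~ {np.cov(X**2, Y**2)[0, 1]:+.4f}")
```

The first three covariances should be close to zero, while the last one is clearly negative, which is exactly the separation between (1) and (2) that the example exhibits.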