Suppose $X=E[X\mid X+Z]$ where $X$ and $Z$ are independent


Suppose that $X=E[X\mid Y]$, where $X \in L^1$ and \begin{align*} Y=X+Z, \end{align*} with $X$ and $Z$ independent and both in $L^1$. What can we say about $X$? Must $X$ be constant in this case?

A particularly interesting case is when $Z$ is a standard normal random variable. This question is related to "If $X=E[X\mid Y]$, what can we say about $X$ and $Y$?".

Possible approach (all equalities almost surely): \begin{align*} &X=E[X\mid Y] \Longleftrightarrow X-E[X\mid Y]=0 \\ & \Longleftrightarrow (X-E[X\mid Y])^2=0 \Longleftrightarrow E[(X-E[X\mid Y])^2]=0 \end{align*}

Note that $E[(X-E[X\mid Y])^2]$ is the MMSE (minimum mean-square error) of estimating $X$ from its observation in Gaussian noise. I don't know much about the MMSE, so the next question is: can the MMSE equal $0$ only if $X$ is constant?
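As a quick numerical sketch of the MMSE (my own addition, not part of the question): take $X$ and $Z$ i.i.d. standard normal, so $E[X\mid Y]=Y/2$ and the MMSE is $\sigma^2_X\sigma^2_Z/(\sigma^2_X+\sigma^2_Z)=1/2$, which is strictly positive; hence $X=E[X\mid Y]$ fails here.

```python
import numpy as np

# X, Z i.i.d. N(0,1), Y = X + Z. Then E[X|Y] = Y/2 and the MMSE
# E[(X - E[X|Y])^2] equals Var(X)Var(Z)/(Var(X)+Var(Z)) = 1/2.
rng = np.random.default_rng(0)
n = 1_000_000
X = rng.standard_normal(n)
Z = rng.standard_normal(n)
Y = X + Z
mmse_mc = np.mean((X - Y / 2) ** 2)  # Monte Carlo estimate of the MMSE
print(mmse_mc)  # close to 0.5, not 0
```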

2 Answers

Best answer:

If $X$ and $Z$ are independently normally distributed with means $\mu_X$ and $\mu_Z$ and variances $\sigma^2_X$ and $\sigma^2_Z$ respectively and $Y=X+Z$ then $$E[X|Y=y]= \mu_X +(y-\mu_X-\mu_Z)\dfrac{\sigma^2_X}{\sigma^2_X+\sigma^2_Z}.$$
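A Monte Carlo check of this formula (my own sketch, not part of the original answer): since $E[X\mid Y]$ is linear in $Y$, the least-squares slope of $X$ against $Y$ should recover $\sigma^2_X/(\sigma^2_X+\sigma^2_Z)$; the parameter values below are arbitrary.

```python
import numpy as np

# Regress X on Y = X + Z; the slope Cov(X, Y)/Var(Y) should equal
# sX2 / (sX2 + sZ2), matching the conditional-mean formula above.
rng = np.random.default_rng(1)
muX, muZ, sX2, sZ2 = 1.0, -2.0, 4.0, 9.0
n = 1_000_000
X = rng.normal(muX, np.sqrt(sX2), n)
Z = rng.normal(muZ, np.sqrt(sZ2), n)
Y = X + Z
slope = np.cov(X, Y)[0, 1] / np.var(Y)  # empirical Cov(X,Y)/Var(Y)
print(slope, sX2 / (sX2 + sZ2))         # both close to 4/13
```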

If $X=E[X|Y]$ then $X=\mu_X +(X+Z-\mu_X-\mu_Z)\dfrac{\sigma^2_X}{\sigma^2_X+\sigma^2_Z}$ so $X=\mu_X +(Z-\mu_Z)\dfrac{\sigma^2_X}{\sigma^2_Z}$ which would mean that $X$ and $Z$ were not independent (unless $\sigma^2_X=0$, in which case $X$ would be almost surely constant).

This only deals with the normal case, but it seems likely to me that, in general, for continuous random variables $E[X|X+Z]$ will be a function of $Z$. $X$ will then not be independent of $Z$ unless $X$ or $Z$ is almost surely constant.

But suppose you had the following joint distribution of $X, Y, Z$ and $E[X|Y]$ as discrete random variables, where $X$ and $Z$ are independent, and where for each value of $Y$ there is only one possible value of $X$ (or at least all others have a probability of $0$):

X   Y   Z   Prob   E[X|Y]
0   0   0   1/4      0
1   1   0   1/4      1
0   2   2   1/4      0
1   3   2   1/4      1
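The counterexample above can be checked exhaustively (my own sketch): $X$ and $Z$ are independent by construction, yet every value of $Y=X+Z$ pins down $X$, so $E[X\mid Y]=X$ with $X$ non-constant.

```python
from itertools import product
from fractions import Fraction

# X uniform on {0, 1}, Z uniform on {0, 2}, independent.
half = Fraction(1, 2)
pX = {0: half, 1: half}
pZ = {0: half, 2: half}

# Joint law of (X, Z) under independence; each of the 4 points has mass 1/4.
joint = {(x, z): pX[x] * pZ[z] for x, z in product(pX, pZ)}

for (x, z), p in joint.items():
    y = x + z
    # E[X | Y = y], summing over all (x', z') with x' + z' = y
    mass = sum(q for (x2, z2), q in joint.items() if x2 + z2 == y)
    ex = sum(x2 * q for (x2, z2), q in joint.items() if x2 + z2 == y) / mass
    assert ex == x  # E[X|Y] = X on every atom

print("E[X|Y] = X holds, with X non-constant and independent of Z")
```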
Second answer:

Assuming all the random variables to be continuous and $X\perp Z,\ Y=X+Z$, we have $$E(X|X+Z=y)=\frac{\int_{-\infty}^\infty xf_X(x)f_Z(y-x)\,dx}{\int_{-\infty}^\infty f_X(x)f_{Z}(y-x)\,dx}$$ When $Z$ has the standard normal distribution, $$E(X|X+Z=y)=\frac{\int_{-\infty}^\infty xf_X(x)\frac{1}{\sqrt{2\pi}} e^{-(y-x)^2/2}\,dx}{\int_{-\infty}^\infty f_X(x)\frac{1}{\sqrt{2\pi}} e^{-(y-x)^2/2}\,dx}=\frac{E(Xe^{-(y-X)^2/2})}{E(e^{-(y-X)^2/2})}$$ So the given condition implies $$Z=\frac{E(Ze^{-Z^2/2}\mid Y)}{E(e^{-Z^2/2}\mid Y)}=E(Z|Y)$$

EDIT: Some further observations. Let us define the real-valued functions $$f_{k}(x)=\int_{-\infty}^{\infty}u^kf_X(u)e^{-(x-u)^2/2 }\,du$$ These functions satisfy the relations $$\frac{df_{k}(x)}{dx}=-xf_k(x)+f_{k+1}(x)$$ Now we can write $$X=\frac{f_1(Y)}{f_0(Y)}=\frac{d}{dY}\ln f_0(Y)+Y=Y+g(Y),\qquad Z=-\frac{d}{dY}\ln f_0(Y)=-g(Y)$$

Now $X\perp Z\implies E(e^{aX+bZ})=Ee^{aX}Ee^{bZ}\ \forall a,b\in \mathbb{R}$, assuming that the mgfs exist (otherwise we can use the characteristic function). Then $$E(e^{a(Y+g(Y))-bg(Y)})=Ee^{a(Y+g(Y))}Ee^{-bg(Y)}$$

If we can somehow show from this that $g$ must be a linear function, then we can show that $X$ has to be constant, at least in the case where $X$ and $Z$ are absolutely continuous random variables. When they are allowed to be discrete, there are a number of interesting counterexamples with non-constant $X$, as in the other answer by Henry and in the comments by Did.
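The derivative identity $f_k'(x)=-xf_k(x)+f_{k+1}(x)$ can be verified numerically (my own sketch); here $f_X$ is taken to be the $N(0,1)$ density purely for the demonstration.

```python
import numpy as np

def f_k(k, x, m=200001, lim=12.0):
    """f_k(x) = integral of u^k f_X(u) exp(-(x-u)^2/2) du, with
    f_X the N(0,1) density, computed by quadrature on a fine grid."""
    u = np.linspace(-lim, lim, m)
    du = u[1] - u[0]
    fX = np.exp(-u**2 / 2) / np.sqrt(2 * np.pi)
    return np.sum(u**k * fX * np.exp(-(x - u)**2 / 2)) * du

x, h = 0.7, 1e-4
lhs = (f_k(0, x + h) - f_k(0, x - h)) / (2 * h)  # numerical f_0'(x)
rhs = -x * f_k(0, x) + f_k(1, x)                 # claimed closed form
print(lhs, rhs)  # the two values agree closely
```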