Suppose $X=E[X\mid Y]$, where $X \in L^1$ and \begin{align*} Y=X+Z \end{align*} with $X$ and $Z$ independent and both in $L^1$. What can we say about $X$? Must $X$ be constant in this case?
I guess a very interesting case is when $Z$ is a standard normal random variable. This question is related to "If $X=E[X\mid Y]$, what can we say about $X$ and $Y$?".
Possible approach: \begin{align*} X=E[X\mid Y] &\Longleftrightarrow X-E[X\mid Y]=0 \text{ a.s.} \\ &\Longleftrightarrow (X-E[X\mid Y])^2=0 \text{ a.s.} \Longleftrightarrow E[(X-E[X\mid Y])^2]=0. \end{align*}
Note that $E[(X-E[X\mid Y])^2]$ is the MMSE (minimum mean square error) of estimating $X$ in Gaussian noise. I don't know much about the MMSE, so the next question is: can the MMSE equal $0$ only if $X$ is constant?
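As a quick numerical illustration (with distributions of my own choosing, not from the question): for a non-constant $X \sim \mathrm{Bernoulli}(1/2)$ and independent $Z \sim N(0,1)$, we can estimate $E[X\mid Y]$ by averaging $X$ within bins of $Y$ and check that the MMSE stays well away from $0$:

```python
import numpy as np

# Illustrative choice: non-constant X ~ Bernoulli(1/2), Z ~ N(0,1) independent,
# Y = X + Z.  Estimate the MMSE E[(X - E[X|Y])^2] by approximating E[X|Y]
# with the average of X inside each Y-bin.
rng = np.random.default_rng(1)
n = 1_000_000
X = rng.integers(0, 2, n).astype(float)
Z = rng.normal(0.0, 1.0, n)
Y = X + Z

bins = np.linspace(Y.min(), Y.max(), 201)
idx = np.digitize(Y, bins)

# conditional mean of X within each Y-bin, a crude estimate of E[X|Y]
counts = np.bincount(idx, minlength=len(bins) + 1)
sums = np.bincount(idx, weights=X, minlength=len(bins) + 1)
cond_mean = np.zeros(len(bins) + 1)
nonempty = counts > 0
cond_mean[nonempty] = sums[nonempty] / counts[nonempty]

mmse = np.mean((X - cond_mean[idx]) ** 2)
print(mmse)  # clearly positive, so X != E[X|Y] for this non-constant X
```

The MMSE here comes out near $0.2$, far from $0$, consistent with the intuition that a non-constant $X$ cannot satisfy $X=E[X\mid Y]$ when the noise is Gaussian.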
If $X$ and $Z$ are independent and normally distributed with means $\mu_X$ and $\mu_Z$ and variances $\sigma^2_X$ and $\sigma^2_Z$ respectively, and $Y=X+Z$, then $$E[X|Y=y]= \mu_X +(y-\mu_X-\mu_Z)\dfrac{\sigma^2_X}{\sigma^2_X+\sigma^2_Z}.$$
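This formula is easy to sanity-check numerically: since $(X,Y)$ is jointly Gaussian, $E[X\mid Y]$ is linear in $Y$ with slope $\operatorname{Cov}(X,Y)/\operatorname{Var}(Y)=\sigma^2_X/(\sigma^2_X+\sigma^2_Z)$, which a Monte Carlo regression recovers (the parameter values below are my own illustrative choice):

```python
import numpy as np

# Monte Carlo check of E[X|Y=y] = mu_X + (y - mu_X - mu_Z) * sX2/(sX2 + sZ2).
# Parameter values are an arbitrary illustrative choice.
rng = np.random.default_rng(0)
mu_X, mu_Z = 1.0, -0.5
sX2, sZ2 = 2.0, 3.0

n = 1_000_000
X = rng.normal(mu_X, np.sqrt(sX2), n)
Z = rng.normal(mu_Z, np.sqrt(sZ2), n)
Y = X + Z

# For jointly Gaussian (X, Y), E[X|Y] is linear in Y, so the least-squares
# slope Cov(X, Y) / Var(Y) should match sX2 / (sX2 + sZ2).
slope = np.cov(X, Y)[0, 1] / np.var(Y)
print(slope, sX2 / (sX2 + sZ2))  # both close to 0.4
```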
If $X=E[X|Y]$ then $X=\mu_X +(X+Z-\mu_X-\mu_Z)\dfrac{\sigma^2_X}{\sigma^2_X+\sigma^2_Z}$ so $X=\mu_X +(Z-\mu_Z)\dfrac{\sigma^2_X}{\sigma^2_Z}$ which would mean that $X$ and $Z$ were not independent (unless $\sigma^2_X=0$, in which case $X$ would be almost surely constant).
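Filling in the algebra for that last step: \begin{align*} X\Bigl(1-\frac{\sigma^2_X}{\sigma^2_X+\sigma^2_Z}\Bigr) &= \mu_X\Bigl(1-\frac{\sigma^2_X}{\sigma^2_X+\sigma^2_Z}\Bigr) +(Z-\mu_Z)\frac{\sigma^2_X}{\sigma^2_X+\sigma^2_Z} \\ X\,\frac{\sigma^2_Z}{\sigma^2_X+\sigma^2_Z} &= \mu_X\,\frac{\sigma^2_Z}{\sigma^2_X+\sigma^2_Z} +(Z-\mu_Z)\frac{\sigma^2_X}{\sigma^2_X+\sigma^2_Z} \\ X &= \mu_X +(Z-\mu_Z)\frac{\sigma^2_X}{\sigma^2_Z}. \end{align*}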
This only deals with the normal case, but it seems likely to me that in general, for continuous random variables, $E[X|X+Z]$ will be a function of $Z$. $X$ will then not be independent of $Z$ unless $X$ or $Z$ is almost surely constant.
But suppose you had the following joint distribution of $X$, $Y$, $Z$ and $E[X|Y]$ as discrete random variables, where $X$ and $Z$ are independent, and where for each value of $Y$ there is only one possible value of $X$ (or at least all others have probability $0$):