Let's consider i.i.d. random variables $X,Y$. I want to calculate $E[X \mid X^2+Y^2]$ in two cases:
- Assuming that $X,Y \sim N(0,1)$
- Assuming that $X,Y \sim U(0,1)$
My work so far
An important thing to notice is that, by the exchangeability of $X$ and $Y$, $E[X^2 \mid X^2+Y^2]=E[Y^2 \mid X^2+Y^2]=\frac{X^2+Y^2}{2}$.
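This symmetry claim can be sanity-checked by simulation. Below is a rough Monte Carlo sketch (the sample size, bin width, seed, and the test point $s_0$ are arbitrary illustrative choices): conditioning on $S=X^2+Y^2$ is approximated by restricting to a narrow bin around $s_0$.

```python
# Rough Monte Carlo check that E[X^2 | S] = S/2 for S = X^2 + Y^2,
# X, Y i.i.d. N(0,1). Conditioning is approximated by a narrow bin.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)
s = x**2 + y**2

# Condition on S falling in a narrow bin around s0 (arbitrary choices).
s0, eps = 2.0, 0.05
mask = np.abs(s - s0) < eps
estimate = np.mean(x[mask] ** 2)  # should be close to s0 / 2 = 1.0
print(estimate)
```

The same bin-based trick is reused below for the other conditional-expectation checks.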
I thought it would be a good idea to calculate $\operatorname{Var}[X \mid X^2+Y^2]$, since $E[X^2 \mid X^2+Y^2]$ is known and $E[X \mid X^2+Y^2]$ appears inside that variance. However, I ran into the problem that I don't know the exact value of $\operatorname{Var}[X \mid X^2+Y^2]$ either. Could you please give me a hint as to whether I'm going in the right direction? Is there a simpler way to do this?
Here is what I would try: First, given two absolutely continuous random variables $X_1$ and $X_2$ with joint density function $f_{X_1,X_2}(x_1,x_2)$ one has $$f_{X_1|X_2}(x_1|x_2)= \frac{f_{X_1,X_2}(x_1,x_2)}{f_{X_2}(x_2)}.$$ From the above, interchanging the roles of $X_1$ and $X_2$, we have $$f_{X_1|X_2}(x_1|x_2)= \frac{f_{X_2|X_1}(x_2|x_1) f_{X_1}(x_1)}{f_{X_2}(x_2)}.$$
Take $X_1=X$ and $X_2=X^2+ Y^2$. Then $$f_{X|X^2+Y^2}(x|y)= \frac{f_{X^2+Y^2|X}(y|x) f_{X}(x)}{f_{X^2+Y^2}(y)}.$$
Hence, so far, $$E[X|X^2+Y^2] = \int_{-\infty}^{\infty} xf_{X|X^2+Y^2}(x|y)dx \Bigg|_{y=X^2+Y^2}=\int_{-\infty}^{\infty} xf_{X|X^2+Y^2}(x|X^2+Y^2)dx.$$
Now, look at $f_{X^2+Y^2|X}(y|x)$. This one should be easy to reduce to a simpler expression, since knowing $X=x$ reduces $X^2+Y^2$ to a univariate density. Indeed, $$P(X^2+Y^2\leq y |X=x)=P(Y^2\leq y-x^2 |X=x).$$ Differentiating we see that $$f_{X^2+Y^2|X}(y|x)=f_{Y^2|X}(y-x^2|x) = f_{Y^2}(y-x^2).$$ In the last step we used that $Y^2$ and $X$ are independent.
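The identity $f_{X^2+Y^2\mid X}(y\mid x)=f_{Y^2}(y-x^2)$ can also be sanity-checked by simulation in the normal case. The sketch below (arbitrary conditioning point $x_0$, bin width, and test point $t$) compares the conditional CDF of $S-x_0^2$ with $P(Y^2\le t)=\operatorname{erf}(\sqrt{t/2})$ for standard normal $Y$.

```python
# Check that, conditionally on X being near x0, S - x0^2 = X^2 + Y^2 - x0^2
# is distributed approximately like Y^2 (chi^2 with 1 df) for X, Y i.i.d. N(0,1).
import math
import numpy as np

rng = np.random.default_rng(4)
n = 2_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)
s = x**2 + y**2

x0, t = 1.0, 0.8  # arbitrary conditioning point and CDF evaluation point
mask = np.abs(x - x0) < 0.02
empirical = np.mean(s[mask] - x0**2 <= t)
theoretical = math.erf(math.sqrt(t / 2))  # P(Y^2 <= t) = P(|Y| <= sqrt(t))
print(empirical, theoretical)
```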
As a summary $$E[X|X^2+Y^2=y] = \int_{-\infty}^{\infty} x\frac{f_{X^2+Y^2|X}(y|x) f_{X}(x)}{f_{X^2+Y^2}(y)}dx = \frac{1}{f_{X^2+Y^2}(y)}\int_{-\infty}^{\infty}x f_{Y^2}(y-x^2) f_X(x)dx.$$ Remark: As you know the notation $E[Z|W=w]$ is an abuse of notation. It actually means $E[Z|\sigma(W)]=F(W)$ for some Borel measurable function $F$ and $E[Z|W=w]$ has the meaning $F(w)$.
Facts:
If $X$ and $Y$ are i.i.d. standard normal then $X^2+Y^2$ is $\chi_2^2$ and $Y^2$ is $\chi_1^2$.
If $X$ and $Y$ are i.i.d. uniformly distributed then $X^2$ and $Y^2$ are beta distributed with parameters $1/2$ and $1$, but I do not know a named distribution for the sum of two independent beta-distributed random variables. Here, you may have to work it out on your own.
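A quick numerical sanity check of the first fact: $\chi_2^2$ is the exponential distribution with mean $2$, so $P(X^2+Y^2>t)=e^{-t/2}$. The sketch below (arbitrary sample size, seed, and test point $t$) compares the empirical tail with this formula.

```python
# Check that S = X^2 + Y^2 is chi^2 with 2 df, i.e. Exponential with
# mean 2, for X, Y i.i.d. N(0,1): P(S > t) should equal exp(-t/2).
import math
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
s = rng.standard_normal(n) ** 2 + rng.standard_normal(n) ** 2

t = 3.0  # arbitrary test point
empirical = np.mean(s > t)
theoretical = math.exp(-t / 2)
print(empirical, theoretical)
```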
Remark: For the normal case observe that $f_X(-x)=f_X(x)$ is an even function, hence $$(-x)\, f_{Y^2}(y-(-x)^2)\, f_X(-x) = -x\, f_{Y^2}(y-x^2)\, f_X(x),$$ so the integrand is an odd function of $x$. Therefore, $$E[X\mid X^2+Y^2]=0.$$
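The conclusion $E[X\mid X^2+Y^2]=0$ is easy to confirm by simulation: within any narrow bin of $S=X^2+Y^2$, the sample mean of $X$ should vanish. A rough sketch (bin centers, widths, and seed are arbitrary):

```python
# Check that E[X | X^2 + Y^2] = 0 for X, Y i.i.d. N(0,1): the binned
# conditional sample means of X should all be close to zero.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)
s = x**2 + y**2

for s0 in (0.5, 1.0, 2.0):  # arbitrary bin centers
    mask = np.abs(s - s0) < 0.05
    print(s0, np.mean(x[mask]))  # each close to 0
```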
For the case of uniformly distributed r.v.'s, use the fact that $Y\sim \mathrm{Unif}(0,1)\Rightarrow Y^2 \sim \mathrm{Beta}(1/2,1)$, whose density is $f_{Y^2}(t)=\frac{1}{2}t^{-1/2}$ for $0<t<1$. Hence, $$E[X|X^2+Y^2=y] = \frac{1}{2}\frac{1}{f_{X^2+Y^2}(y)}\int_{0}^{1} {\bf 1}_{\{0\leq y-x^2\leq 1\}} \frac{x}{\sqrt{y-x^2}}\, dx.$$ This integral is elementary, since $\frac{x}{\sqrt{y-x^2}}$ has antiderivative $-\sqrt{y-x^2}$. The harder part is the normalizing density $f_{X^2+Y^2}(y)$, which you can get from the convolution $\int f_{X^2}(u)\, f_{Y^2}(y-u)\, du$; there a change of variables like $u=y\sin^2(z)$ leads to $\arcsin$ terms. If the closed form gets messy, you can always fall back on numerical integration.
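As a sketch of how this works out, the code below evaluates the integrals in closed form and compares the result with a Monte Carlo estimate. The closed-form expressions in `cond_mean` are my own evaluation of the numerator (via the antiderivative $-\sqrt{y-x^2}$) and of the convolution density (via $\arcsin$), so treat them as a hypothesis to be verified rather than an established answer.

```python
# Sketch: E[X | X^2 + Y^2 = y] for X, Y i.i.d. U(0,1), comparing a
# tentative closed form with a binned Monte Carlo estimate.
import math
import numpy as np

def cond_mean(y):
    # Tentative closed forms (my own evaluation of the integrals):
    # numerator: (1/2) * [-sqrt(y - x^2)] over the admissible x-range,
    # denominator: f_{X^2+Y^2}(y) via 2*arcsin(sqrt(u/y)) terms.
    if 0 < y <= 1:
        num = math.sqrt(y) / 2
        den = math.pi / 4
    elif 1 < y < 2:
        num = (1 - math.sqrt(y - 1)) / 2
        den = 0.5 * (math.asin(math.sqrt(1 / y))
                     - math.asin(math.sqrt(1 - 1 / y)))
    else:
        raise ValueError("y must lie in (0, 2)")
    return num / den

rng = np.random.default_rng(3)
n = 2_000_000
x = rng.random(n)
s = x**2 + rng.random(n) ** 2

for y0 in (0.5, 1.5):  # one point in each branch
    mask = np.abs(s - y0) < 0.02
    print(y0, cond_mean(y0), np.mean(x[mask]))
```

For $0<y\le 1$ this tentative closed form simplifies to $\frac{2\sqrt{y}}{\pi}$.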
I hope this helps.