Is there any shortcut for finding $E(X_1\mid \sum_{i=1}^nX_i^2)$ where $X_1,\cdots,X_n$ are i.i.d. $\mathcal N(\theta,\theta)$ variables?


Suppose $X_1,X_2,\cdots,X_n$ are independent and identically distributed $\mathcal N(\theta,\theta)$ random variables where $\theta>0$. I am looking for the conditional mean $E(X_1\mid \sum_{i=1}^nX_i^2)$.

I think this should equal $E(\bar X\mid \sum_{i=1}^nX_i^2)$, where $\bar X$ is the sample mean. Both $X_1$ and $\bar X$ are unbiased for $\theta$, while $\sum_{i=1}^nX_i^2$ is a complete sufficient statistic for the family of distributions. So by the Lehmann–Scheffé theorem, each conditional expectation is the UMVUE of $\theta$, and since the UMVUE is unique (almost surely) whenever it exists, the two must coincide.
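The claim that the two conditional expectations coincide can be checked empirically by conditioning approximately on $T$: sort simulated samples by $T$, and compare the within-bin averages of $X_1$ and $\bar X$. A minimal sketch (NumPy assumed; the choices $\theta=1$, $n=5$, and the bin sizes are arbitrary, not from the question):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 1.0, 5, 1_000_000

# i.i.d. N(theta, theta) samples; T = sum of squares per sample.
X = rng.normal(theta, np.sqrt(theta), size=(reps, n))
T = (X**2).sum(axis=1)

# Approximate conditioning on T: sort by T and average within narrow bins.
bins = np.argsort(T).reshape(100, -1)        # 100 bins of 10,000 samples each
e_x1 = X[:, 0][bins].mean(axis=1)            # ~ E(X_1  | T in bin)
e_xbar = X.mean(axis=1)[bins].mean(axis=1)   # ~ E(Xbar | T in bin)

# The two conditional means should agree bin by bin, up to Monte Carlo noise.
gap = np.abs(e_x1 - e_xbar).mean()
print(gap)  # should be small
```

Of course this is only a consistency check, not a derivation, but it makes the Lehmann–Scheffé argument concrete.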

This reference tells me it should be that $$E(X_1\mid T)=E(\bar X\mid T)=\sqrt{\frac{T}{n}}\frac{I_{n/2}\left(\sqrt{nT}\right)}{I_{n/2-1}\left(\sqrt{nT}\right)}$$

where $T=\sum_{i=1}^nX_i^2$ and $I_\nu(\cdot)$ is the modified Bessel function of the first kind of order $\nu$.
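One quick sanity check of this expression (again not a derivation): if the right-hand side really is $E(\bar X\mid T)$, then by the tower property its expectation over samples must be exactly $\theta$, so averaging it over simulated data should recover $\theta$. A sketch assuming SciPy is available ($\theta=1$, $n=10$, and the replication count are arbitrary choices):

```python
import numpy as np
from scipy.special import iv  # modified Bessel function of the first kind

rng = np.random.default_rng(0)
theta, n, reps = 1.0, 10, 200_000

# Simulate T = sum_i X_i^2 for many i.i.d. N(theta, theta) samples.
X = rng.normal(theta, np.sqrt(theta), size=(reps, n))
T = (X**2).sum(axis=1)

# Claimed conditional mean: g(T) = sqrt(T/n) * I_{n/2}(sqrt(nT)) / I_{n/2-1}(sqrt(nT))
g = np.sqrt(T / n) * iv(n / 2, np.sqrt(n * T)) / iv(n / 2 - 1, np.sqrt(n * T))

# If g(T) = E(X_1 | T), then E[g(T)] = E[X_1] = theta.
print(g.mean())  # should be close to theta = 1
```

This also illustrates why the formula is useful: $g(T)$ is then the UMVUE of $\theta$.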

The only way I can think of doing this is by finding the conditional density first. But that is a tad cumbersome. Is there any alternative approach?

We have $E(X_1^2\mid \sum_{i=1}^nX_i^2=t)=\frac{t}{n}$: since the $X_i$ are i.i.d., the $X_i^2$ are exchangeable, so $E(X_1^2\mid T=t)=\cdots=E(X_n^2\mid T=t)$, and these $n$ equal terms sum to $E(T\mid T=t)=t$.

So, \begin{align}\left[E\left(X_1\mid \sum_{i=1}^nX_i^2=t\right)\right]^2&=E\left(X_1^2\mid \sum_{i=1}^nX_i^2=t\right)-\text{Var}\left(X_1\mid \sum_{i=1}^nX_i^2=t\right)\\&=\frac{t}{n}-\text{Var}\left(X_1\mid \sum_{i=1}^nX_i^2=t\right)\end{align}

But that looks like a dead end.
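As an aside, the exchangeability identity $E(X_1^2\mid T=t)=t/n$ used above can itself be checked by approximate conditioning: sort simulated samples by $T$ into narrow bins and compare the within-bin average of $X_1^2$ with the within-bin average of $T/n$. A rough sketch (NumPy assumed; $\theta$, $n$, and bin sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 1.0, 5, 1_000_000

X = rng.normal(theta, np.sqrt(theta), size=(reps, n))
T = (X**2).sum(axis=1)

# Approximate conditioning on T by sorting and binning.
bins = np.argsort(T).reshape(100, -1)          # 100 bins of 10,000 samples each
x1sq_mean = (X[:, 0]**2)[bins].mean(axis=1)    # ~ E(X_1^2 | T in bin)
t_over_n = T[bins].mean(axis=1) / n            # ~ t / n within each bin

dev = np.abs(x1sq_mean - t_over_n).mean()
print(dev)  # should be small across all bins
```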