I have been given the pdf:
$$f_X (x; \lambda) = \left(\frac{\lambda}{\pi}\right)^{\frac{1}{2}} x^{-\frac{3}{2}} e^{-\frac{\lambda}{x}} $$
with support $x>0$ and $\lambda>0$.
I am asked to find the UMVUE of $1/\lambda$.
My progress so far: I first observed that the likelihood function is of the exponential-family form
$$a(\lambda)b(x)e^{c(\lambda)\times \sum_{i=1}^{n}\frac{1}{X_i}}$$
and thus $$S(X)=\sum_{i=1}^{n}\frac{1}{X_i}$$ is a complete sufficient statistic.
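Spelling the factorization out, the joint density of the sample is
$$\prod_{i=1}^{n} f_X(X_i;\lambda)=\left(\frac{\lambda}{\pi}\right)^{\frac{n}{2}}\left(\prod_{i=1}^{n}X_i^{-\frac{3}{2}}\right)e^{-\lambda\sum_{i=1}^{n}\frac{1}{X_i}},$$
so $a(\lambda)=\left(\frac{\lambda}{\pi}\right)^{n/2}$, $b(x)=\prod_{i=1}^{n}x_i^{-3/2}$, and $c(\lambda)=-\lambda$.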
Moreover, since $\frac{1}{X_i}$ follows a $\Gamma\left(\frac{1}{2},\frac{1}{\lambda}\right)$ distribution (shape $\frac{1}{2}$, scale $\frac{1}{\lambda}$), we can easily see that $$S^*=\frac{2}{n}S$$ is an unbiased estimator of $\frac{1}{\lambda}$.
Now the Lehmann–Scheffé theorem would say that $T^* = \mathbb{E}(S^* \mid S)$ is the UMVUE, but this conditional expectation seems too hard to compute... Any ideas?
Assume $X_1,X_2,\ldots,X_n$ are i.i.d. with pdf $$f_X(x;\lambda)=\sqrt{\frac{\lambda}{\pi}}\,e^{-\lambda/x}\,x^{-3/2}\,\mathbf 1_{x>0},\qquad\lambda>0.$$
By the Lehmann–Scheffé theorem, an unbiased estimator of $1/\lambda$ that is a function of a complete sufficient statistic is the UMVUE of $1/\lambda$. You have already found a complete sufficient statistic $T=\sum\limits_{i=1}^n\frac{1}{X_i}$.
And since $E\left(\frac{1}{X_1}\right)=\frac{1}{2\lambda}$, we have $$E(T)=\sum_{i=1}^n E\left(\frac{1}{X_i}\right)=\frac{n}{2\lambda}$$
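(For completeness, the first moment follows from the substitution $u=1/x$:
$$E\left(\frac{1}{X_1}\right)=\sqrt{\frac{\lambda}{\pi}}\int_0^\infty x^{-5/2}e^{-\lambda/x}\,dx=\sqrt{\frac{\lambda}{\pi}}\int_0^\infty u^{1/2}e^{-\lambda u}\,du=\sqrt{\frac{\lambda}{\pi}}\cdot\frac{\Gamma(3/2)}{\lambda^{3/2}}=\frac{1}{2\lambda}.)$$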
This already shows that the UMVUE of $\frac{1}{\lambda}$ is $\frac{2}{n}T$.
Note that $E\left(\frac{2}{n}T\mid T\right)=\frac{2}{n}E(T\mid T)=\frac{2}{n}T$, so the conditioning step is redundant here: $S^*$ is already a function of $T$.
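If you want a sanity check, here is a quick Monte Carlo sketch (illustrative only, with an arbitrary $\lambda$). It uses the fact noted above that $1/X\sim\Gamma(\tfrac12,\text{scale}=\tfrac1\lambda)$, so we can sample $X=1/G$ with `numpy`:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0            # true lambda; the estimand is 1/lambda = 0.5
n, reps = 50, 20000  # sample size and number of replications

# 1/X ~ Gamma(shape=1/2, scale=1/lambda), so draw G and set X = 1/G
G = rng.gamma(shape=0.5, scale=1.0 / lam, size=(reps, n))
X = 1.0 / G

# Proposed UMVUE of 1/lambda: (2/n) * sum_i 1/X_i
umvue = (2.0 / n) * (1.0 / X).sum(axis=1)

# The average over replications should be close to 1/lambda = 0.5
print(umvue.mean())
```

Across the 20000 replications the mean of the estimator lands very near $1/\lambda$, consistent with unbiasedness.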