If $X \sim POI(\mu)$, show that $S = (-1)^X$ is the UMVUE of $e^{-2\mu}$. Is this a reasonable estimator?
UMVUE stands for uniformly minimum-variance unbiased estimator.
The pmf of the Poisson distribution is $P(X = x) = \frac{e^{-\mu}\mu^x}{x!}$, $x = 0, 1, 2, \dots$
How would you do this problem?
It's easy to see that $(-1)^X$ is the only unbiased estimator: if $f(X)$ is unbiased, then for all $\mu>0$, $$Ef(X)=\sum^\infty_{k=0}f(k)P(X=k)=e^{-\mu}\sum^\infty_{k=0}\frac{f(k)}{k!}\mu^k=e^{-2\mu}$$ implies $$\sum^\infty_{k=0}\frac{f(k)}{k!}\mu^k=e^{-\mu}=\sum^\infty_{k=0}\frac{(-1)^k}{k!}\mu^k,$$ so we must have $f(k)=(-1)^k$ by the uniqueness theorem for power series. Being the only unbiased estimator, $(-1)^X$ is trivially the one of minimum variance, hence the UMVUE.
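As a quick numerical sanity check (not part of the proof), a small Monte Carlo simulation confirms that $E(-1)^X = e^{-2\mu}$; the choice $\mu = 0.7$ and the sample count are arbitrary:

```python
import math
import random

def mc_mean_sign_estimator(mu, trials=200_000, seed=0):
    """Monte Carlo estimate of E[(-1)^X] for X ~ Poisson(mu)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Sample X ~ Poisson(mu) by CDF inversion (fine for small mu).
        u, k, p = rng.random(), 0, math.exp(-mu)
        cdf = p
        while u > cdf:
            k += 1
            p *= mu / k
            cdf += p
        total += (-1) ** k
    return total / trials

mu = 0.7
print(mc_mean_sign_estimator(mu), math.exp(-2 * mu))  # the two values should agree to ~2 decimals
```

With $2 \times 10^5$ trials the standard error of the sample mean is on the order of $10^{-3}$, so the agreement is well within noise.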
Of course, it's as poor an estimator as one can expect from a sample of size $1$: it only takes the values $\pm 1$, while the target $e^{-2\mu}$ lies in $(0,1)$.

If we have $n$ independent $X_i \sim POI(\mu)$, we can easily see that $\displaystyle S_n=\sum^n_{i=1}X_i$ is a sufficient statistic, so $E((-1)^{X_1}\mid S_n)$ is the UMVUE. That conditional expectation is not hard to calculate, along the same lines as the example in the Wikipedia article about the Rao–Blackwell theorem, but it isn't really necessary: for $X \sim POI(\mu)$, we have $\displaystyle E\alpha^X=e^{(\alpha-1)\mu}$ (the probability generating function; the above is the special case $\alpha=-1$). Now $S_n \sim POI(n\mu)$, so $\displaystyle E\alpha^{S_n}=e^{(\alpha-1)n\mu}=e^{-2\mu}$ exactly for $\displaystyle\alpha=1-\frac2n$, i.e. our estimator is $$f(S_n)=\left(1-\frac2n\right)^{S_n}.$$

This looks far more reasonable, especially since as $n\to\infty$ we have $\displaystyle\left(1-\frac2n\right)^n\to e^{-2}$ and $\displaystyle\frac{S_n}n\to\mu$ a.s. by the strong law of large numbers, and thus $\displaystyle f(S_n)\to e^{-2\mu}$ a.s.
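The same kind of sketch (again just a numerical check, with arbitrary choices of $\mu$ and $n$) illustrates that $\left(1-\frac2n\right)^{S_n}$ is unbiased for $e^{-2\mu}$, and with much smaller spread than the single-observation estimator:

```python
import math
import random

def rb_estimator_mean(mu, n, trials=100_000, seed=1):
    """Monte Carlo mean of (1 - 2/n)^{S_n}, where S_n ~ Poisson(n * mu)."""
    rng = random.Random(seed)
    lam = n * mu
    alpha = 1 - 2 / n
    total = 0.0
    for _ in range(trials):
        # Sample S_n ~ Poisson(n * mu) by CDF inversion (ok for moderate lam).
        u, k, p = rng.random(), 0, math.exp(-lam)
        cdf = p
        while u > cdf:
            k += 1
            p *= lam / k
            cdf += p
        total += alpha ** k
    return total / trials

mu, n = 0.5, 20
print(rb_estimator_mean(mu, n), math.exp(-2 * mu))  # both close to e^{-1} ~ 0.3679
```

Since $E\alpha^{S_n}=e^{(\alpha-1)n\mu}$ with $\alpha=1-\frac2n$ gives exactly $e^{-2\mu}$, the simulated mean matches the target for every $n \ge 3$ (for $n \le 2$ the base $1-\frac2n$ is zero or negative, and the estimator again behaves badly).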