Let $X_1,\dots,X_n$ be i.i.d. $\mathrm{Pois}(\theta)$ with unknown $\theta>0$ and $n\geq2$. Is $\hat{\gamma}=\left(\frac{n-1}{n}\right)^T$, where $T=\sum_{i=1}^{n}X_i$, the UMVUE for $f(\theta)=e^{-\theta}$? How can I show it is consistent?
To show the estimator is consistent, I want to show that $\mathbf{E}_\theta[(\hat{\gamma}-f(\theta))^2]\to0$ as $n\to\infty$. However, I am having a hard time evaluating this expectation.
As you suggest, you can show the mean squared error goes to zero, which is sufficient for consistency. Alternatively, you can use the weak law of large numbers (WLLN) and the continuous mapping theorem (CMT):
Define $a_n=\left(1-\frac{1}{n}\right)^{n}$, $a=e^{-1}$, $b_n=\frac{1}{n}\sum_{i=1}^n X_i$, $b=\theta$.
Note that $$\hat\gamma=\left(1-\frac{1}{n}\right)^{T}=\left[\left(1-\frac{1}{n}\right)^{n}\right]^{T/n}={a_n}^{b_n}.$$
The WLLN tells us $b_n$ converges in probability to $b=\theta$, while $a_n\to a=e^{-1}$ deterministically. Since the map $(x,y)\mapsto x^y$ is continuous at $(e^{-1},\theta)$, the CMT tells us that $\hat\gamma={a_n}^{b_n}$ converges in probability to $a^b=e^{-\theta}$, and hence $\hat\gamma$ is consistent for $f(\theta)$.
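Returning to the MSE route you originally asked about: the expectation is in fact tractable, because $T\sim\mathrm{Pois}(n\theta)$ and the Poisson probability generating function gives $\mathbf{E}[c^T]=e^{n\theta(c-1)}$ for any constant $c$. Taking $c=\frac{n-1}{n}$ and $c=\left(\frac{n-1}{n}\right)^2$:
$$\mathbf{E}_\theta[\hat\gamma]=e^{n\theta\left(\frac{n-1}{n}-1\right)}=e^{-\theta},\qquad
\mathbf{E}_\theta[\hat\gamma^2]=e^{n\theta\left(\left(\frac{n-1}{n}\right)^2-1\right)}=e^{-\theta\left(2-\frac{1}{n}\right)},$$
so $\hat\gamma$ is unbiased and
$$\mathbf{E}_\theta\left[(\hat\gamma-e^{-\theta})^2\right]=\mathrm{Var}_\theta(\hat\gamma)=e^{-2\theta}\left(e^{\theta/n}-1\right)\xrightarrow[n\to\infty]{}0.$$
This also settles your first question: $\hat\gamma$ is an unbiased function of the complete sufficient statistic $T$, so by Lehmann–Scheffé it is the UMVUE for $e^{-\theta}$.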
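If you want to see the consistency numerically, here is a minimal Monte Carlo sketch (the value $\theta=1.5$, the tolerance $0.05$, and the replication count are illustrative choices, not from the problem). It exploits the fact that $T\sim\mathrm{Pois}(n\theta)$, so we can simulate $T$ directly:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 1.5                # hypothetical true parameter, for illustration only
target = np.exp(-theta)    # f(theta) = e^{-theta}
reps = 20_000              # Monte Carlo replications per sample size

frac_close = {}
for n in (10, 100, 1000):
    # T is the sum of n i.i.d. Pois(theta) draws, i.e. T ~ Pois(n * theta)
    T = rng.poisson(n * theta, size=reps)
    gamma_hat = ((n - 1) / n) ** T
    # proportion of replications with |gamma_hat - e^{-theta}| < 0.05
    frac_close[n] = np.mean(np.abs(gamma_hat - target) < 0.05)
    print(n, frac_close[n])
```

The printed proportions should climb toward $1$ as $n$ grows, which is exactly what $\hat\gamma\xrightarrow{p}e^{-\theta}$ predicts.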