I have a question that feels completely out of the blue. We are studying the first topics in statistics (bias, mean squared error, consistency) and we have been given the task of comparing the following estimators for $P(X=0)$, where $X\sim \mathrm{Pois}(\lambda)$: $$ \hat{\theta}=e^{-\bar{x}}$$ $$\hat{\zeta}=\frac{\sum_{i=1}^{n}{I(X_{i}=0)}}{n}$$
$I$ being the indicator r.v.
I know that a sum of Poisson r.v.s is Poisson, and that I have to use the moment generating function for the first estimator, but I have only found the expected value of the second one and have not been able to get anywhere with the expected value of the first. Same with the MSE and checking consistency.
It is easy to see that $\mathbb E[\hat\zeta] = \mathbb P(X=0)$; that is, the estimator is unbiased. Moreover, since an indicator equals its own square, $\mathbb E[I(X_i=0)^2] = \mathbb P(X=0)$, so the MSE of $\hat\zeta$ equals its variance: $$\operatorname{MSE}(\hat\zeta) = n^{-1}\mathbb P(X=0) - n^{-1}\mathbb P(X=0)^2.$$ The MSE converges to zero as $n\to\infty$, which implies (via Chebyshev's inequality) that $\hat\zeta$ converges in probability to $\mathbb P(X=0)$, i.e. $\hat\zeta$ is consistent.
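As a quick sanity check on that MSE formula, here is a small simulation (a hypothetical illustration, not part of the argument; the `poisson` sampler, seed, and the parameter choices `lam`, `n`, `reps` are my own):

```python
import math
import random

def poisson(lam, rng):
    """Sample Pois(lam) via Knuth's multiplication method."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(0)
lam, n, reps = 1.0, 50, 5000
p0 = math.exp(-lam)                 # true P(X = 0)
theory_mse = p0 * (1 - p0) / n      # n^{-1} P(X=0) - n^{-1} P(X=0)^2

# Monte Carlo estimate of E[(zeta-hat - P(X=0))^2]
sq_err = 0.0
for _ in range(reps):
    sample = [poisson(lam, rng) for _ in range(n)]
    zeta = sum(x == 0 for x in sample) / n
    sq_err += (zeta - p0) ** 2
sim_mse = sq_err / reps

print(sim_mse, theory_mse)  # the two values should be close
```

The simulated and theoretical MSE agree to within Monte Carlo noise, which supports the variance computation above.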
Now consider $\hat\theta$. Since $n\bar X = \sum_{i=1}^n X_i \sim \mathrm{Pois}(n\lambda)$, it holds that $$\mathbb E[\hat\theta] = \sum_{k=0}^\infty e^{-k/n}\frac{(n\lambda)^k}{k!}\exp(-n\lambda) = \exp(-n\lambda)\sum_{k=0}^\infty\frac{(n\lambda e^{-1/n})^k}{k!} = \exp(-n\lambda+n\lambda e^{-1/n}).$$ The same computation with $e^{-2k/n}$ in place of $e^{-k/n}$ gives $\mathbb E[\hat\theta^2] = \exp(-n\lambda + n\lambda e^{-2/n})$. Since $n(e^{-1/n}-1)\to -1$ as $n\to\infty$, the estimator is biased for every finite $n$ but asymptotically unbiased: $\mathbb E[\hat\theta]\to e^{-\lambda}$. Computing the exact MSE from these two moments is tedious but mechanical. As for consistency: the law of large numbers gives $\bar X\to\lambda$ in probability, and since $\exp$ is continuous the continuous mapping theorem yields $\hat\theta = e^{-\bar X}\to e^{-\lambda} = \mathbb P(X=0)$ in probability. So $\hat\theta$ is a consistent estimator.
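The closed-form mean $\exp(-n\lambda+n\lambda e^{-1/n})$ can also be checked numerically (again a hypothetical sketch; the `poisson` sampler and the choices `lam`, `n`, `reps` are mine):

```python
import math
import random

def poisson(lam, rng):
    """Sample Pois(lam) via Knuth's multiplication method."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(1)
lam, n, reps = 1.0, 20, 20000

# Closed-form E[theta-hat] derived from n*xbar ~ Pois(n*lam)
mean_theta = math.exp(-n * lam + n * lam * math.exp(-1 / n))

# Monte Carlo estimate of E[exp(-xbar)]
total = 0.0
for _ in range(reps):
    xbar = sum(poisson(lam, rng) for _ in range(n)) / n
    total += math.exp(-xbar)
sim_mean = total / reps

print(sim_mean, mean_theta)  # the two values should be close
```

Note that with $\lambda=1$ and $n=20$ the mean is about $0.377$ rather than $e^{-1}\approx 0.368$, making the finite-sample bias visible directly.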