Suppose that $Y_1, ..., Y_n$ is a random sample from a Poisson distribution with unknown mean $\theta$ and let $T = \sum_{i = 1}^n Y_i$. Find the constant $c$ such that the estimator $e^{-cT}$ is unbiased as an estimator of $e^{-\theta}$.
To show that the estimator is unbiased, I want to show that $E_\theta(e^{-cT}) = e^{-\theta}$, where $T$ has a Poisson($n\theta$) distribution. The m.g.f. of the Poisson distribution with mean $\theta$ is given as $M(t) = e^{\theta(e^t - 1)}$.
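As a sanity check on that mgf formula, one can sum the series $E(e^{tY}) = \sum_y e^{ty}\,e^{-\theta}\theta^y/y!$ numerically and compare it with the closed form; a minimal sketch (the particular values of $\theta$ and $t$ below are arbitrary choices):

```python
import math

# Check the Poisson mgf M(t) = exp(theta*(e^t - 1)) by truncating
# the series E(e^{tY}) = sum_y e^{ty} * e^{-theta} * theta^y / y!
theta, t = 2.5, 0.3          # arbitrary illustrative values
series = sum(math.exp(t * y) * math.exp(-theta) * theta**y / math.factorial(y)
             for y in range(120))   # terms beyond this are negligible
closed_form = math.exp(theta * (math.exp(t) - 1))
print(abs(series - closed_form) < 1e-9)  # True
```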
I understand that
$E(Y) = M'(0)$
$E(Y^2) = M''(0)$
But what is $E(e^{-cT})$?
So the solution points out: "note that $E(e^{-cT}) = e^{n\theta(e^{-c} -1)}$ (the mgf of a Poisson$(n\theta)$ distribution evaluated at $-c$)." But why?
In general, we can compute $E(e^{-cT})$, where $T$ is Poisson with mean $\mu$, directly from the pmf $f(t)=e^{-\mu}\mu^t/t!$ of $T$: \begin{align*} E(e^{-cT}) &= \sum_{t=0}^\infty e^{-ct}f(t)\\ &=\sum_{t=0}^\infty e^{-ct} e^{-\mu} \mu^t/t!\\ &= \sum_{t=0}^\infty e^{-\mu} (e^{-c} \mu)^t/t!\\ &= e^{\mu(e^{-c}-1)}\sum_{t=0}^\infty e^{-e^{-c}\mu} (e^{-c} \mu)^t/t!\\ &= e^{\mu(e^{-c}-1)} \end{align*} where the fourth line multiplies and divides by $e^{-e^{-c}\mu}$ (note that $e^{\mu(e^{-c}-1)}e^{-e^{-c}\mu} = e^{-\mu}$), and the last step comes from identifying $e^{-e^{-c}\mu}(e^{-c}\mu)^t/t!$ as the pmf of a Poisson with mean $e^{-c}\mu$, so the sum equals $1$.
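This identity also answers the original question: setting $e^{n\theta(e^{-c}-1)} = e^{-\theta}$ requires $n(e^{-c}-1) = -1$, i.e. $c = \ln\frac{n}{n-1}$. Both the identity and that choice of $c$ can be checked numerically; a quick sketch (the values of $n$ and $\theta$ below are arbitrary):

```python
import math

# Verify E(e^{-cT}) = exp(mu*(e^{-c} - 1)) for T ~ Poisson(mu) by
# truncating the series, then check that c = log(n/(n-1)) makes
# e^{-cT} unbiased for e^{-theta}, i.e. E(e^{-cT}) = e^{-theta}.
n, theta = 10, 1.7               # arbitrary illustrative values
mu = n * theta                   # T = sum of the Y_i ~ Poisson(n*theta)
c = math.log(n / (n - 1))        # solves n*(e^{-c} - 1) = -1

series = sum(math.exp(-c * t) * math.exp(-mu) * mu**t / math.factorial(t)
             for t in range(150))  # terms beyond this are negligible
closed_form = math.exp(mu * (math.exp(-c) - 1))
print(abs(series - closed_form) < 1e-9)             # True
print(abs(closed_form - math.exp(-theta)) < 1e-9)   # True: unbiased
```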