I am trying to get my head around the maximum likelihood estimator (MLE) of a function of a parameter.
Say I have $X_i \sim \text{Poisson}(\theta)$ samples. I want to find the MLE of $\pi(\theta) = \exp(-\theta) = P_\theta(X = 0)$.
The MLE of the Poisson parameter $\theta$ is the sample mean $\bar{X}$. Is
$$ \exp \left( -\bar{X} \right) $$ then the MLE of $\pi(\theta)$?
Yes. This is the invariance property of the MLE: if $T$ is the MLE of $\theta$ and $f$ is a one-to-one function, then $f(T)$ is the MLE of $f(\theta)$. The proof follows directly from the definition of the MLE: writing $\eta = f(\theta)$, the likelihood in the new parametrization is $L^*(\eta) = L(f^{-1}(\eta))$, and since $f$ is one-to-one, $L^*$ attains its maximum exactly at $\eta = f(T)$. Applied here, $\exp(-\bar{X})$ is the MLE of $\pi(\theta) = \exp(-\theta)$.
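As a quick numerical sanity check (a sketch using NumPy; the variable names are my own), you can maximize the Poisson log-likelihood reparametrized in $\pi = \exp(-\theta)$ over a grid and confirm the maximizer agrees with $\exp(-\bar{X})$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0
x = rng.poisson(theta_true, size=10_000)

# MLE of theta is the sample mean; by invariance, MLE of pi is exp(-theta_hat)
theta_hat = x.mean()
pi_hat = np.exp(-theta_hat)

# Reparametrize: theta = -log(pi). Poisson log-likelihood up to a constant:
#   sum(x) * log(theta) - n * theta
pis = np.linspace(0.01, 0.99, 9801)      # grid over pi with step 1e-4
thetas = -np.log(pis)
loglik = x.sum() * np.log(thetas) - x.size * thetas
pi_grid_max = pis[np.argmax(loglik)]

print(pi_hat, pi_grid_max)  # agree up to the grid resolution
```

The grid maximizer matches $\exp(-\bar{X})$ to within the grid spacing, illustrating that maximizing in the $\pi$ parametrization gives the same answer as plugging the MLE of $\theta$ into $\pi(\theta)$.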