MLE of a function of parameter


I am trying to get my head around the maximum likelihood estimator (MLE) of a function of a parameter.

Say I have $X_i \sim \text{Poisson}(\theta)$ samples. I want to find the MLE of $\pi(\theta) = \exp(-\theta) = P_\theta(X = 0)$.

The MLE of the Poisson parameter $\theta$ is the sample mean $\bar{X}$. Is

$$ \exp \left( -\bar{X} \right) $$ the MLE of $\pi$ ?
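As a quick numerical sanity check (a sketch of my own, with an arbitrary choice $\theta = 2$): the plug-in estimate $\exp(-\bar X)$ can be compared with the empirical fraction of zero counts, which also estimates $P_\theta(X = 0)$.

```python
import numpy as np

# Sanity check: compare the plug-in estimator exp(-Xbar) with the
# empirical fraction of zeros; both estimate P(X = 0) = exp(-theta).
# theta = 2.0 is an arbitrary choice for illustration.
rng = np.random.default_rng(0)
theta = 2.0
x = rng.poisson(theta, size=100_000)

plug_in = np.exp(-x.mean())      # exp(-Xbar)
empirical = np.mean(x == 0)      # fraction of zero counts
true_value = np.exp(-theta)      # P(X = 0)

print(plug_in, empirical, true_value)
```

With a large sample, both estimates should land close to $e^{-2} \approx 0.135$.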

There are 2 answers below.


If $T$ is a statistic which is the MLE of the parameter $\theta$, and $f$ is a one-to-one function, then $f(T)$ is the MLE of $f(\theta)$. To see this, use the definition of the MLE directly: writing $\eta = f(\theta)$, the likelihood in the new parametrization is $L^*(\eta) = L\bigl(f^{-1}(\eta)\bigr)$, which is maximized exactly when $f^{-1}(\eta) = T$, i.e. at $\eta = f(T)$.
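This can also be checked numerically. The following sketch (my own, using `scipy.optimize.minimize_scalar`) maximizes the Poisson log-likelihood directly in the parametrization $\pi = e^{-\theta}$ and compares the maximizer with $e^{-\bar X}$:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Maximize the Poisson log-likelihood in pi = exp(-theta) directly and
# check that the maximizer matches exp(-Xbar), as invariance predicts.
# theta = 1.5 and n = 500 are arbitrary choices for illustration.
rng = np.random.default_rng(1)
x = rng.poisson(1.5, size=500)

def neg_loglik(pi):
    # l(pi) = n*log(pi) + sum(x)*log(-log(pi)), up to an additive constant
    return -(len(x) * np.log(pi) + x.sum() * np.log(-np.log(pi)))

res = minimize_scalar(neg_loglik, bounds=(1e-6, 1 - 1e-6), method="bounded")
print(res.x, np.exp(-x.mean()))
```

The two printed values should agree to within the optimizer's tolerance.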

---

Let us assume that the sample $X_1,\ldots,X_n$ is independently and identically distributed (iid), i.e. $$ X_i\overset{\text{iid}}{\sim} \text{Poisson}(\theta),\quad i=1,\ldots,n, $$

and let $\pi = e^{-\theta}$. Then $\theta = -\log\pi$ and thus the likelihood function for $\pi$ is

$$ L(\pi) \propto e^{n\log\pi} (-\log \pi)^{\sum_i X_i} = \pi^n (-\log \pi)^{\sum_i X_i}. $$

The log-likelihood function is $$ \ell(\pi) = n\log\pi + \sum_iX_i\log(-\log\pi), $$

and the maximum likelihood estimator (MLE) is the solution in $\pi$ of

$$ \ell^\prime(\pi) = 0 = \frac{n}{\pi} + \frac{\sum_i X_i}{\log\pi}\frac{1}{\pi}. $$

The MLE is thus $\log\hat\pi = -\bar X$, i.e. $\hat\pi = e^{-\bar X}$. But this comes as no surprise, since:
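One can verify numerically that $\hat\pi = e^{-\bar X}$ makes the score $\ell^\prime(\pi)$ vanish. A small sketch with simulated data (the choice $\theta = 3$, $n = 200$ is arbitrary):

```python
import numpy as np

# Check numerically that pi_hat = exp(-Xbar) solves l'(pi) = 0, where
# l'(pi) = n/pi + (sum_i x_i) / (pi * log(pi)) as derived above.
rng = np.random.default_rng(2)
x = rng.poisson(3.0, size=200)
n, s = len(x), x.sum()

def score(pi):
    return n / pi + s / (pi * np.log(pi))

pi_hat = np.exp(-x.mean())
print(score(pi_hat))  # should be ~0 up to floating-point rounding
```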

the MLE is invariant with respect to reparametrizations.

The claim that the MLE is invariant holds even when the reparametrization $g$ is not one-to-one: one defines the induced likelihood $L^*(\eta) = \sup_{\theta : g(\theta) = \eta} L(\theta)$, which is maximized at $\eta = g(\hat\theta)$, though in this case the proof is more involved.
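For intuition, here is a sketch of the non-one-to-one case with a hypothetical non-injective map $g(\theta) = (\theta - 2)^2$: the induced likelihood $L^*(\eta) = \sup_{\theta : g(\theta) = \eta} L(\theta)$, evaluated over the two preimages $2 \pm \sqrt{\eta}$, is maximized at $\eta = g(\bar X)$.

```python
import numpy as np

# Invariance under a non-one-to-one map: g(theta) = (theta - 2)^2 is a
# hypothetical example. The induced likelihood maximizes L over the
# preimage {2 - sqrt(eta), 2 + sqrt(eta)} of each eta; its maximizer
# should coincide with g(theta_hat) = g(Xbar) up to grid resolution.
rng = np.random.default_rng(3)
x = rng.poisson(2.5, size=300)

def loglik(theta):
    # Poisson log-likelihood up to an additive constant
    return -len(x) * theta + x.sum() * np.log(theta)

def g(theta):
    return (theta - 2.0) ** 2

etas = np.linspace(0.0, 4.0, 2001)
profile = np.full_like(etas, -np.inf)
for i, eta in enumerate(etas):
    roots = [2.0 + np.sqrt(eta), 2.0 - np.sqrt(eta)]
    profile[i] = max(loglik(t) for t in roots if t > 0)

eta_hat = etas[np.argmax(profile)]
print(eta_hat, g(x.mean()))
```

The grid maximizer of the induced likelihood should match $g(\bar X)$ to within the grid spacing.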