Maximum Likelihood Estimator for Poisson Distribution


While studying for my upcoming exams, I came upon this unusual MLE exercise:

Let the random variable $X$ follow the Poisson Distribution with unknown parameter $\theta >0$. In $50$ observations of $X$, you only know that $20$ of them are zero. Find the Maximum Likelihood Estimator of $\theta$ using only this fact.

How would one proceed to find the MLE using only the fact that $20$ of the observations are zero? I have never come across such an MLE problem.

3 Answers

Accepted answer:

You have $X_1,\ldots,X_{50} \sim \text{i.i.d. } \operatorname{Poisson}(\lambda),$ so $\Pr(X_1=0) = e^{-\lambda}.$

The number of successes in $50$ trials with probability $e^{-\lambda}$ of success on each trial has a binomial distribution. So one has \begin{align} L(\lambda) & = \Pr(\text{exactly 20 such “successes'' in 50 trials}) \\[10pt] & = \binom{50}{20} \left( e^{-\lambda} \right)^{20} \left( 1 - e^{-\lambda}\right)^{30}. \end{align} We seek the value of $\lambda$ that maximizes that. Letting $p = e^{-\lambda},$ we can instead find the value of $p$ that maximizes the expression above. Up to an additive constant (the log of the binomial coefficient), \begin{align} \log L & = 20 \log p + 30\log(1-p), \\[10pt] \frac d{dp} \log L & = \frac{20} p - \frac{30}{1-p} = \frac{20 - 50p}{p(1-p)} = \frac{50((20/50) - p)}{p(1-p)} =\begin{cases} >0 & \text{if } 0 < p < 20/50, \\ <0 & \text{if } 20/50 < p < 1. \end{cases} \end{align} So $\log L$ increases up to $p = 20/50$ and decreases afterwards, giving $\hat p = 20/50 = 2/5.$ Since $p = e^{-\lambda},$ the MLE is $\hat\lambda = -\log(2/5) = \log(5/2) \approx 0.916.$
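As a quick numerical sanity check (a sketch using only the Python standard library; the counts $20$ and $30$ are from the problem), a grid search over $\lambda$ should land on the crossing point $p = 20/50$, i.e. $\lambda = -\log(20/50)$:

```python
import math

# Log-likelihood in lambda, constants dropped: 20 zeros and 30 non-zeros
# out of 50 i.i.d. Poisson(lambda) draws, where P(X = 0) = exp(-lambda).
def log_lik(lam):
    p = math.exp(-lam)  # probability of a zero observation
    return 20 * math.log(p) + 30 * math.log(1 - p)

# Grid search over lambda in (0, 5); the analytic maximizer is -log(20/50).
grid = [i / 10000 for i in range(1, 50000)]
lam_hat = max(grid, key=log_lik)
print(lam_hat, -math.log(20 / 50))
```

The grid maximizer agrees with $-\log(20/50) \approx 0.916$ to the grid's resolution.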

Answer:

Hints:

  • Given $\theta$, what is the probability of a zero in a particular observation? Of a non-zero?
  • Given $\theta$, what is the probability of $20$ zeros and $30$ non-zeros?
  • What is the likelihood proportional to?
  • Which $\theta$ maximises this likelihood?

Answer:

Based on your description, your data can be summarized as $(X_i, \delta_i), i = 1, \dots, 50$, where \begin{align} \delta_i = \begin{cases} 1 & \text{ if } X_i = 0, \\ 0 & \text{ if } X_i > 0. \end{cases} \end{align}

If $\delta_i = 1$, then the contribution of the observation to the likelihood is $P(X_i = 0) = e^{-\theta} = e^{-\theta\delta_i}$. If $\delta_i = 0$, then the contribution of $X_i$ to the likelihood is $P(X_i > 0) = 1 - e^{-\theta} = (1 - e^{-\theta})^{1 - \delta_i}$. Each of these two cases may be written in a unified way as $(e^{-\theta})^{\delta_i}(1 - e^{-\theta})^{1 - \delta_i}$.

In summary, the likelihood function takes the form

$$L(\theta) = \prod_{i = 1}^{50} e^{-\theta\delta_i}(1 - e^{-\theta})^{1 - \delta_i}. $$

Or, equivalently, the log-likelihood $$\ell(\theta) = \sum_{i = 1}^{50}(-\theta \delta_i + (1 - \delta_i)\log(1 - e^{-\theta})) = -20\theta + 30\log(1 - e^{-\theta}), \tag{*}$$ where we used the condition $\sum_{i = 1}^{50}\delta_i = 20$.

You can then find the MLE $\hat{\theta}$ by maximizing $(*)$ with respect to $\theta$.
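As a sketch of that last step (Python standard library only; the counts $20$ and $30$ are from the problem): setting the derivative of $(*)$ to zero, $-20 + 30e^{-\theta}/(1 - e^{-\theta}) = 0$, gives $e^{-\theta} = 20/50$, and one can verify numerically that this is indeed a maximum:

```python
import math

# Log-likelihood from (*): ell(theta) = -20*theta + 30*log(1 - exp(-theta)).
def ell(theta):
    return -20 * theta + 30 * math.log(1 - math.exp(-theta))

# Solving ell'(theta) = -20 + 30*exp(-theta)/(1 - exp(-theta)) = 0
# yields exp(-theta) = 20/50, i.e. theta_hat = log(50/20).
theta_hat = math.log(50 / 20)

# Sanity check: ell is strictly smaller a small step to either side.
eps = 1e-4
print(theta_hat,
      ell(theta_hat) > ell(theta_hat - eps),
      ell(theta_hat) > ell(theta_hat + eps))
```

This agrees with the accepted answer's estimate, as it must, since both likelihoods differ only by a constant factor.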