maximum likelihood estimation problem tutorial


The question: given an i.i.d. sample $x_1, \dots, x_n$ from the density $f(x;\delta) = e^{-(x-\delta)}\,\mathbf{1}_{[\delta,+\infty)}(x)$, find the maximum likelihood estimator of $\delta$. My answer is $\hat\delta = 0$, but I am not sure whether it is correct, or how to show whether the estimator is biased.




The likelihood function is $L(\delta) = e^{-\sum_i (x_i - \delta)}$ when all the $x_i \geq \delta$ and $0$ otherwise. Maximizing this is equivalent to maximizing the log-likelihood, which is $l(\delta) = - \sum_i (x_i - \delta)$ when all the $x_i \geq \delta$ and $-\infty$ otherwise. Now, maximize the log-likelihood.


This is a tricky question. As stated by Batman, the log-likelihood is indeed $l(\delta) = - \sum_i (x_i - \delta)$. Unfortunately, you cannot just apply calculus: the log-likelihood is strictly increasing in $\delta$, so its derivative never equals zero, leading to the naive estimator $\hat\delta = \infty$. You see this by differentiating the log-likelihood with respect to $\delta$, NOT $x$:

$\frac{dl(\delta)}{d\delta} = n > 0$

However, you cannot increase $\delta$ without bound, because of the indicator function $\mathbf{1}_{[\delta,+\infty)}$ in the definition of the density. What happens when you increase $\delta$ beyond the smallest observation? Your entire likelihood becomes zero! Therefore, the likelihood is maximized at $\hat\delta = \min_i x_i$, since the likelihood drops to zero if you set $\delta$ any higher.
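A quick numerical check of this argument (a sketch using NumPy; the true shift `delta` and the sample size are illustrative choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
delta = 2.0                                # true shift parameter (illustrative)
x = delta + rng.exponential(1.0, size=50)  # i.i.d. sample from the shifted Exp(1) density

def log_likelihood(d, x):
    # l(d) = -sum_i (x_i - d) when all x_i >= d, and -infinity otherwise
    return -np.sum(x - d) if np.all(x >= d) else -np.inf

mle = x.min()  # the likelihood is maximized at the sample minimum
print(log_likelihood(mle, x))           # finite maximum
print(log_likelihood(mle + 1e-6, x))    # -inf: any delta above the minimum kills the likelihood
print(log_likelihood(mle - 0.5, x))     # finite but smaller: log-likelihood increases toward the minimum
```

Pushing $\delta$ even slightly past $\min_i x_i$ makes the log-likelihood $-\infty$, while below the minimum it is strictly increasing, so the sample minimum is the maximizer.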

So, what about the bias? Given the estimator above, you can see qualitatively that it is biased high: the sample minimum always satisfies $\min_i x_i \geq \delta$, so its expectation exceeds $\delta$. The exact derivation requires the expected value of the minimum of $n$ observations. The distribution of the minimum follows from $P(\min_i x_i > t) = \prod_i P(x_i > t) = e^{-n(t-\delta)}$ for $t \geq \delta$, i.e., $\min_i x_i - \delta \sim \mathrm{Exp}(n)$, so $E[\hat\delta] = \delta + 1/n$ and the bias is $1/n$.
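Since the minimum of $n$ i.i.d. Exp(1) variables is Exp($n$), the bias of the estimator should be close to $1/n$; a Monte Carlo check (a sketch, with illustrative values for `delta`, `n`, and the number of replications):

```python
import numpy as np

rng = np.random.default_rng(1)
delta, n, reps = 2.0, 10, 200_000  # illustrative values

# Each row is one simulated sample; the MLE is the row-wise minimum.
samples = delta + rng.exponential(1.0, size=(reps, n))
mle = samples.min(axis=1)

# min_i x_i - delta ~ Exp(n), so E[mle] = delta + 1/n and the bias is 1/n.
bias = mle.mean() - delta
print(bias)  # close to 1/n = 0.1
```

The simulated bias matches the analytic value $1/n$, confirming that the MLE overestimates $\delta$ but that the bias vanishes as the sample size grows.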