Maximum Likelihood Estimate and Second derivative test?


We're doing MLEs in my probability and statistics class at the moment, and my professor insists that after setting the derivative of $\ln L$ to zero we still need to take the second derivative to check that we actually have the MLE. My book, however, says that solving $\frac{d}{d\theta}\ln L = 0$ always gives the MLE. He refuses to let us skip this step without a theorem or proof, but I can't find one. Can anyone point me towards one, or correct me if our book is wrong?

Best answer:

The MLE is simply the global maximum of a function: the likelihood $L(\theta)$, viewed as a function of the parameter $\theta$.

You are supposed to know, not from statistics but from calculus, that, in general, the global maximum of a function cannot be found simply by differentiating the function and equating the derivative to zero. That only gives a critical point (perhaps several). It can well happen that

  1. the function is not differentiable at some points of the domain
  2. a critical point is not a maximum
  3. the global maximum occurs on the boundary of the domain

The estimation of $\theta$ for a uniform variable on $[0,\theta]$ is an example of the last situation: the MLE is $\hat\theta = \max_i x_i$, which lies on the boundary of the region where the likelihood is positive, and the derivative is nonzero there.
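A minimal numerical sketch of the uniform case (sample values and names are illustrative): the likelihood $L(\theta) = \theta^{-n}$ for $\theta \ge \max_i x_i$ and $0$ otherwise, so the maximizer sits at the boundary point $\max_i x_i$, not at a zero of the derivative.

```python
import numpy as np

# Illustrative sample from Uniform[0, 5]; any positive sample works.
rng = np.random.default_rng(0)
x = rng.uniform(0, 5.0, size=50)

def likelihood(theta, x):
    """L(theta) = theta^{-n} if theta >= max(x), else 0 (every x_i must fit in [0, theta])."""
    return theta ** (-len(x)) if theta >= x.max() else 0.0

# Scan a grid: L jumps up at theta = max(x) and then strictly decreases,
# so the maximizer is on the boundary, where dL/dtheta is not zero.
grid = np.linspace(0.1, 10.0, 2000)
vals = [likelihood(t, x) for t in grid]
mle = grid[int(np.argmax(vals))]
print(mle, x.max())  # the grid maximizer sits just above max(x)
```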

Granted, if you know that

  1. $L(\theta)$ is differentiable on all its domain
  2. there is a single critical point
  3. the domain of the parameter is the whole real line
  4. $L(\theta)$ tends to zero at $\pm \infty$

then you know that the critical point is the MLE. If you know the first three conditions but not the fourth, then you should compute the second derivative. If you only know $1.$ and $2.$, then you should also check the boundary of the domain.
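A textbook case where all four conditions hold is the mean of a Normal$(\mu, 1)$ sample: the log-likelihood is differentiable everywhere, $\mu$ ranges over the whole real line, and there is a single critical point. A symbolic sketch (sample size and symbol names are illustrative) also carries out the professor's second-derivative check:

```python
import sympy as sp

n = 5
xs = sp.symbols('x0:5')               # five illustrative observations
mu = sp.symbols('mu', real=True)

# Log-likelihood of Normal(mu, 1), dropping the additive constant.
ell = -sp.Rational(1, 2) * sum((xi - mu) ** 2 for xi in xs)

crit = sp.solve(sp.diff(ell, mu), mu)  # unique critical point: the sample mean
second = sp.diff(ell, mu, 2)           # second derivative: -n < 0, so a maximum
print(crit, second)
```

The second derivative is the constant $-n < 0$, confirming the critical point $\hat\mu = \bar x$ is a maximum; since it is the only critical point of a function differentiable on all of $\mathbb{R}$, it is the global maximum, i.e. the MLE.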