Why does the Method of Maximum Likelihood always give a maximum?


A very simple question. My assumption is that it always gives a maximum because the logarithmic function is monotonically increasing. So differentiating will give a max as opposed to a min. Is this true?

BEST ANSWER

No. The fact that the logarithmic function is monotonically increasing only means that $\log L(\theta)$ reaches its maximum at the value $\theta^*$ where $L(\theta)$ itself reaches its maximum; and the same is true for minima (if we only consider values of $\theta$ such that $L(\theta)>0$).

The logarithm only appears for convenience; in either case you are looking for the same point, at which both $L$ and $\log L$ reach their maxima.
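To see this numerically, here is a small sketch (the Bernoulli likelihood with $k=7$ successes in $n=10$ trials is my own illustrative choice, not from the question): evaluating $L$ and $\log L$ on a grid shows they peak at the same $\theta$.

```python
import numpy as np

# Illustrative example: Bernoulli likelihood for k = 7 successes in n = 10 trials,
# L(theta) = theta^k * (1 - theta)^(n - k).
# Because log is monotonically increasing, L and log L have the same argmax.
k, n = 7, 10
theta = np.linspace(0.001, 0.999, 999)  # avoid the endpoints where log L = -inf
L = theta**k * (1 - theta)**(n - k)
logL = k * np.log(theta) + (n - k) * np.log(1 - theta)

assert np.argmax(L) == np.argmax(logL)
print(theta[np.argmax(L)])  # the familiar MLE k/n = 0.7
```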

I guess you're considering the particular case where $L$ is differentiable and positive, and there is only one point where $L'(\theta)=0$. In those cases, the argument that the critical point is a maximum should rest on the fact that $L$ increases before and decreases after it, not on the logarithm.
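That sign-change argument can be checked directly. Continuing the same illustrative Bernoulli example (my choice, not from the question): the derivative of the log-likelihood is positive just before the critical point $\theta^* = k/n$ and negative just after it, so $\theta^*$ is indeed a maximum.

```python
# Sketch: verify that the unique critical point of the Bernoulli log-likelihood
# is a maximum, by checking that its derivative changes sign from + to -.
k, n = 7, 10
theta_star = k / n  # the critical point, where dlogL = 0

def dlogL(theta):
    # d/dtheta of log L = k/theta - (n - k)/(1 - theta)
    return k / theta - (n - k) / (1 - theta)

eps = 1e-3
assert dlogL(theta_star - eps) > 0  # log L increasing before theta*
assert dlogL(theta_star + eps) < 0  # log L decreasing after theta*
```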