Why does maximum likelihood fail in this simple case?


$\def\e{\mathrm{e}}$ Suppose one decides to parametrize the exponential probability density in this unorthodox way

$$ f(x; θ) = -θ \e^{θx}, \quad x > 0 $$ where $θ \in (-∞, 0)$.
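As a quick sanity check that this unorthodox parametrization really does give a valid density, one can numerically integrate $f(x;\theta)$ over $(0,\infty)$ for a few negative values of $\theta$ (an illustrative sketch; the choice of test values is arbitrary):

```python
import numpy as np
from scipy.integrate import quad

# For theta < 0, f(x; theta) = -theta * exp(theta * x) should integrate
# to 1 over (0, infinity), i.e. it is a proper probability density.
for theta in (-0.5, -1.0, -3.0):
    total, _ = quad(lambda x: -theta * np.exp(theta * x), 0, np.inf)
    print(f"theta = {theta}: integral = {total:.6f}")
```

Each integral comes out to 1, confirming the density is correctly normalized for every $\theta < 0$.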

Then setting the derivative of the log-likelihood for a single observation equal to zero gives

$$\frac{-1}{\theta} + x = 0 \iff \frac{1}{x} = \theta $$

This would not be a good estimator of $\theta$, since $1/x > 0$ lies outside the parameter space.

From the second derivative of the log-likelihood, $\frac{1}{\theta^2} > 0$, we see that we are in the presence of a minimum rather than a maximum.

Where did the maximum likelihood procedure go wrong? Was it an error in my calculations? The fact that the parameter space is not compact? (But in the ordinary parametrization everything works.)

For existence, it should be sufficient that the parameter space is compact and the likelihood function is continuous on it.


Best answer:

The problem is that you've used

$$ \frac{\mathrm d}{\mathrm d\theta}\log(-\theta)=\frac{-1}\theta\;, $$

whereas

$$ \frac{\mathrm d}{\mathrm d\theta}\log(-\theta)=-\left.\frac{\mathrm d}{\mathrm dx}\log x\,\right|_{x=-\theta}=-\left.\frac 1x\right|_{x=-\theta}=-\frac1{-\theta}=\frac1\theta\;. $$
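With the corrected derivative, the maximization goes through as expected (a short completion of the calculation, following the answer's correction):

$$ \ell(\theta) = \log(-\theta) + \theta x, \qquad \ell'(\theta) = \frac{1}{\theta} + x = 0 \iff \hat\theta = -\frac{1}{x} < 0\;, $$

and since $\ell''(\theta) = -\frac{1}{\theta^2} < 0$, the critical point $\hat\theta = -1/x$ is indeed a maximum. This agrees with the usual estimator $\hat\lambda = 1/x$ under the standard parametrization $\lambda = -\theta$.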