Maximum likelihood estimator of $\theta$, $f(x;\theta) = 1/2 e^{-|x - \theta|}$


I'm given $f(x;\theta) = \frac12 e^{-|x - \theta|}$, $-\infty < x < \infty$ and $0 < \theta < \infty$. I want to find the maximum likelihood estimator of $\theta$. I found:

$$\ln L(\theta; x_1,..., x_n) = -n \ln 2 - \sum |x_i - \theta|$$

Usually I would differentiate and find the maximum, but the absolute value makes $L$ non-differentiable at each $x_i$. By inspection, $\sum |x_i - \theta| \geq 0$, so $L$ is maximized when $\sum |x_i - \theta|$ is minimized. But how can I express the minimizing $\theta$ in terms of the $x_i$'s?


Best answer:

Hint: Suppose that the $x_i$'s have been reindexed so that they are in increasing order: $x_1\leq x_2\leq\cdots\leq x_n$. Suppose that $x_k<\theta<x_{k+1}$. Then $$ \ln L(\theta; x_1,\ldots,x_n)=-n\ln 2-\sum_{i=1}^k(\theta-x_i)-\sum_{i=k+1}^n(x_i-\theta). $$ Taking the derivative with respect to $\theta$ gives $$ -\sum_{i=1}^k(1)-\sum_{i=k+1}^n(-1)=-k+(n-k)=n-2k. $$ Therefore, if $k<\frac{n}{2}$, the derivative is positive, so $L$ increases as $\theta$ grows. If $k>\frac{n}{2}$, the derivative is negative, so $L$ decreases as $\theta$ grows. This mimics the first derivative test for a maximum.
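The sign argument above suggests the maximizer is the sample median. A quick numerical sanity check of that conclusion (a sketch using NumPy on simulated data; the sample size and grid resolution are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated Laplace sample with true location 3.0; odd n gives a unique median
x = rng.laplace(loc=3.0, scale=1.0, size=101)

def log_likelihood(theta, x):
    """Laplace log-likelihood: -n*ln(2) - sum_i |x_i - theta|."""
    return -len(x) * np.log(2) - np.sum(np.abs(x - theta))

# Candidate MLE suggested by the derivative-sign argument
theta_hat = np.median(x)

# Brute-force comparison: maximize over a fine grid of theta values
grid = np.linspace(x.min(), x.max(), 20001)
ll = -len(x) * np.log(2) - np.abs(x[:, None] - grid[None, :]).sum(axis=0)
theta_grid = grid[np.argmax(ll)]

print(theta_hat, theta_grid)  # the two estimates should agree up to grid spacing
```

Since the log-likelihood is piecewise linear in $\theta$ with slope $n-2k$ between order statistics, the grid maximizer should land within one grid step of the median.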