Maximum Likelihood Estimator of $\theta$ for $f(x,\theta)=e^{\theta-x}, x>\theta$


In a recent test, I was expected to calculate the MLE for $\theta$ in a simple random sample of size $n$ where each individual is independent and follows the distribution:$$f(x,\theta)=\begin{cases}e^{\theta-x},&x>\theta\\0,&\text{o/w}\end{cases}$$The likelihood function is $L(\theta)=e^{n(\theta-\bar x)}\mathbf 1_{x_{(1)}>\theta}$ where $\bar x$ is the sample mean and $x_{(1)}$ is the sample minimum.

Quite evidently, the likelihood function is positive and strictly increasing for $\theta<x_{(1)}$ but $0$ for $\theta\ge x_{(1)}$, so the maximum is never attained. I concluded that the MLE does not exist, but my teacher says it is $x_{(1)}$.
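A quick numerical check illustrates the behavior described above. This is a sketch with a hypothetical simulated sample (the true $\theta$, seed, and sample size are arbitrary choices, not from the question): the log-likelihood $n(\theta-\bar x)$ increases as $\theta$ approaches $x_{(1)}$ from below, then jumps to $-\infty$ at $\theta=x_{(1)}$.

```python
import numpy as np

rng = np.random.default_rng(0)      # arbitrary seed, for illustration only
theta_true = 2.0                    # hypothetical true parameter
n = 50
x = theta_true + rng.exponential(size=n)  # X = theta + Exp(1) has density e^{theta-x}, x > theta

x_min = x.min()                     # the sample minimum x_(1)

def log_likelihood(theta):
    # log L(theta) = n*(theta - xbar) when theta < x_(1), and -inf otherwise
    return n * (theta - x.mean()) if theta < x_min else -np.inf

# strictly increasing as theta approaches x_(1) from below...
thetas = x_min - np.array([1.0, 0.1, 0.01, 1e-6])
values = [log_likelihood(t) for t in thetas]
assert all(values[i] < values[i + 1] for i in range(len(values) - 1))

# ...but the supremum is never attained: L drops to 0 at theta = x_(1)
assert log_likelihood(x_min) == -np.inf
```

So the supremum of $L$ over $\theta$ is approached, but not attained, at $\theta=x_{(1)}$, which is exactly the tension the question raises.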

The issue is that in several textbooks the MLE is wrongly defined as

$$\operatorname{argmax}_{\theta}L(\theta)$$

while the correct definition is

$$ \bbox[5px,border:2px solid black] { \hat{\theta}_{ML}=\operatorname{argsup}_{\theta}L(\theta) \ } $$

Thus the MLE can exist without belonging to the parameter space (it need only belong to its Euclidean closure).


Look at the following example.

Given a simple random sample $X_1,\dots,X_n$ from each of the following distributions,

1.

$$f_X(x)=\frac{1}{\theta}\mathbb{1}_{[0;\theta]}(x)$$

2.

$$f_X(x)=\frac{1}{\theta}\mathbb{1}_{(0;\theta)}(x)$$

derive the ML estimator for $\theta$.

Since $P(X=\theta)=0$, it is evident that the estimator of the parameter cannot differ between the two cases.

Actually, in both cases $\hat{\theta}_{ML}=X_{(n)}$
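The two uniform cases can be checked numerically as well. This is a sketch with a hypothetical simulated sample (true $\theta$, seed, and sample size are arbitrary): with the closed support $[0,\theta]$ the maximum of $L(\theta)=\theta^{-n}\mathbb 1_{\theta\ge x_{(n)}}$ is attained at $x_{(n)}$, while with the open support $(0,\theta)$ we have $L(x_{(n)})=0$, yet the supremum is still approached as $\theta\to x_{(n)}^+$, so the argsup is $x_{(n)}$ in both cases.

```python
import numpy as np

rng = np.random.default_rng(1)      # arbitrary seed, for illustration only
theta_true = 3.0                    # hypothetical true parameter
n = 40
x = rng.uniform(0, theta_true, size=n)
x_max = x.max()                     # the sample maximum x_(n)

def likelihood_closed(theta):
    # f = (1/theta) on [0, theta]: L(theta) = theta^(-n) iff theta >= x_(n)
    return theta ** (-n) if theta >= x_max else 0.0

def likelihood_open(theta):
    # f = (1/theta) on (0, theta): L(theta) = theta^(-n) iff theta > x_(n)
    return theta ** (-n) if theta > x_max else 0.0

# closed support: the maximum is attained at theta = x_(n)
assert likelihood_closed(x_max) > likelihood_closed(x_max + 0.01)

# open support: L(x_(n)) = 0, but L(theta) -> x_(n)^(-n) as theta -> x_(n)+,
# so the argsup is still x_(n)
assert likelihood_open(x_max) == 0.0
assert likelihood_open(x_max + 1e-9) > likelihood_open(x_max + 0.01)
```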


Do not forget that $X_{(1)}$ and $X_{(n)}$ are still random variables, so capital letters are required.