How to find the maximum likelihood estimator for $\theta$


$f\left( x\right) =\begin{cases}e^{-\left( x-\theta \right) }, & x \ge \theta\\ 0, & \text{otherwise}\end{cases}$

We want to find the estimator for $\theta$. I already have the solution but I don't understand it.

They first write $ f(x;\theta) = e^{(\theta-x)} \cdot 1_{(x \ge 0)}$. I don't understand why, or what it means, so I would like some clarification on that.

Then they find the joint density (I am not going to go into details here because I mostly understand what is going on): $f(x;\theta) = \exp \left\{ n\theta -\sum ^{n}_{i=1}x_{i}\right\} \cdot 1_{\left( x_{(1)}\ge 0\right) }$. Now why is it $x_{(1)}$ under the $1$ instead of $x_i$? Why do we take the smallest value?

They then proceed to say: "The maximum likelihood estimator for $\theta$ is simply the value of $\theta$ that maximizes this function. Now, seen as a function of $\theta$, it is increasing up to the point $\theta = x_{(1)}$, after which it becomes 0. So it's at this point that it's maximized, which means that $\widehat {\theta } = X_{(1)}$."

I don't understand their statement: how can $\theta$ end up equal to the smallest value if the function is increasing?


Best answer:

First, your question has a typo: $f(x,\theta)$ should be $\exp(\theta-x)\,1_{x\ge\theta}.$ The notation $1_{x_i\ge\theta}$ is read as "the indicator that $x_i\ge\theta$." It means the following: $$1_{x_i\ge\theta}=\begin{cases}1 & \text{if }x_i\ge\theta \\ 0 & \text{otherwise.}\end{cases}$$

Now let us study the joint distribution. The joint density of independent random variables is the product of their individual densities, so the joint density here is $$f(x_1,\dots,x_n;\theta)=\displaystyle\exp\left(n\theta-\sum_{i=1}^n x_i\right)\prod_{i=1}^n 1_{x_i\ge\theta}.$$

Look at the product $\prod_{i=1}^n 1_{x_i\ge\theta}.$ It can only be $1$ or $0.$ When is it $0?$ Whenever any one of the $x_i$'s is less than $\theta.$ So the product is $1$ if and only if all the $x_i$'s are at least $\theta,$ that is, if and only if $x_{(1)}=\min_i x_i\ge\theta.$
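To make the role of the indicator concrete, here is a small Python sketch (not part of the original solution; the sample values are made up) that evaluates this joint likelihood directly. The whole product of indicators collapses to a single check on the sample minimum:

```python
import math

def likelihood(theta, xs):
    """Joint likelihood of a shifted-exponential sample.

    f(x; theta) = exp(theta - x) for x >= theta, else 0, so
    L(theta) = exp(n*theta - sum(x_i)) * 1{min(x_i) >= theta}.
    """
    if min(xs) < theta:  # the indicator product is 0: some x_i < theta
        return 0.0
    n = len(xs)
    return math.exp(n * theta - sum(xs))

# Hypothetical sample for illustration; its minimum is 1.7.
sample = [2.3, 1.7, 3.1, 2.0]
print(likelihood(1.0, sample) > 0)  # positive: theta <= min(sample)
print(likelihood(1.8, sample))      # 0.0: theta exceeds min(sample)
```

Note that the likelihood is positive exactly when $\theta \le \min_i x_i$, which is the single-condition form of the indicator product.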

We have now answered the question of how $x_{(1)}$ comes up. Now look at the joint likelihood as a function of $\theta$: $\exp(n\theta)$ is increasing in $\theta.$ If $\theta\le x_{(1)},$ the likelihood is positive and increasing in $\theta,$ and if $\theta>x_{(1)},$ the likelihood is zero. So to maximize the likelihood, we take $\widehat{\theta}=x_{(1)}.$
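A quick simulation (my own illustration, not from the original answer) backs this up: drawing $X = \theta + E$ with $E \sim \mathrm{Exp}(1)$, the sample minimum can never fall below $\theta$ and, for large $n$, sits just above it:

```python
import random

random.seed(0)
theta = 2.0  # assumed true value for the simulation
# Draw from the shifted exponential: X = theta + E, E ~ Exp(1).
sample = [theta + random.expovariate(1.0) for _ in range(10_000)]

mle = min(sample)  # maximum likelihood estimate: x_(1)
print(mle >= theta)  # always True: every x_i is at least theta
print(mle - theta)   # small positive gap; shrinks like 1/n
```

The gap $x_{(1)} - \theta$ is itself $\mathrm{Exp}(n)$-distributed with mean $1/n$, which is why the estimator hugs $\theta$ from above as the sample grows.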