Find the maximum likelihood estimator for $\theta$ and $\delta$


Find the maximum likelihood estimator for $\theta$ and $\delta$ considering a random sample of size $n$ from a population with distribution $$f(x)=\frac{1}{\theta}e^{-(x-\delta)/\theta},\quad x>\delta$$

I know $L(\theta,\delta)=\prod_{i=1}^n \frac{1}{\theta} e^{-(x_i-\delta)/\theta}$. When I simplify, I get $$L(\theta,\delta)=\frac{1}{\theta^n}\,e^{-\frac{1}{\theta}\sum_{i=1}^n x_i}\,e^{\frac{n\delta}{\theta}}$$ Taking the natural log of this function, I get $$\ell(\theta,\delta)=-n\ln\theta - \frac{1}{\theta}\sum_{i=1}^n x_i+\frac{n\delta}{\theta}$$

Now, taking the partial derivatives of $\ell$ with respect to $\theta$ and $\delta$, I have the following:

$$\frac{\partial\ell}{\partial\theta}=-\frac{n}{\theta}+\frac{1}{\theta^2}\sum_{i=1}^n x_i-\frac{n\delta}{\theta^2}$$

$$\frac{\partial\ell}{\partial\delta}=\frac{n}{\theta}$$ Setting these both equal to zero and doing some manipulation, I found $\hat\delta=\bar x$ and $\hat\theta=0$, but this does not make sense, since $\theta$ must be positive. I have seen other examples where similar issues arise, but I'm not sure how to look at this in order to find the value of $\theta$ that actually maximizes the function.

3 Answers

Best answer

If you look at your formula for $\log L(\theta,\delta)$, you'll see that $\log L$ is increasing in $\delta$: the bigger $\delta$ is, the more likely the data appear. But the range restriction on $x$ means that $\hat\delta$ must be less than all of your observations, so $\hat\delta=\min_i x_i$. Now look at $\frac{\partial}{\partial\theta}\log L$ and solve for $\hat\theta$.
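Spelling out that last step (this computation is mine, not in the original answer): plugging $\hat\delta=\min_i x_i$ into the $\theta$-equation gives

$$\frac{\partial\ell}{\partial\theta}\Big|_{\delta=\hat\delta}=-\frac{n}{\theta}+\frac{1}{\theta^2}\sum_{i=1}^n\bigl(x_i-\hat\delta\bigr)=0 \quad\Longrightarrow\quad \hat\theta=\frac{1}{n}\sum_{i=1}^n x_i-\min_i x_i=\bar x-\min_i x_i.$$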

In a nutshell: the restriction $x>\delta$ means the formula you state for $\log L$ is not valid for all values of the parameters, but your derivative calculation ignored this fact.
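As a quick numerical sanity check (my addition, not part of this answer): the closed-form estimates $\hat\delta=\min_i x_i$ and $\hat\theta=\bar x-\hat\delta$ can be compared against a direct numerical maximization of $\ell$. A minimal sketch, assuming NumPy and SciPy are available and using simulated data:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, theta_true, delta_true = 200, 2.0, 5.0
x = delta_true + rng.exponential(theta_true, size=n)  # shifted-exponential sample

# Closed-form MLEs derived above
delta_hat = x.min()
theta_hat = x.mean() - delta_hat

# Negative log-likelihood; the likelihood is zero unless theta > 0 and delta < min(x)
def neg_loglik(params):
    theta, delta = params
    if theta <= 0 or delta >= x.min():
        return np.inf
    return n * np.log(theta) + (x - delta).sum() / theta

# Nelder-Mead tolerates the hard boundary penalty; the delta estimate
# should push up against min(x), matching the endpoint-maximum argument
res = minimize(neg_loglik, x0=[1.0, x.min() - 1.0], method="Nelder-Mead")
print("closed form: theta =", theta_hat, " delta =", delta_hat)
print("numerical:   theta =", res.x[0], "delta =", res.x[1])
```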

Answer

"Setting these both to zero" is a knee-jerk response; some thought is needed.

It is not true that setting derivatives to zero is a universally valid way of finding extrema.

You said $x>\delta.$ That means $\delta<\text{all of }x_1,\ldots,x_n,$ so $\delta< \min\{x_1,\ldots,x_n\}.$ As a function of $\delta,$ the log-likelihood $\ell$ increases until $\delta$ gets up to $\min\{x_1,\ldots,x_n\}.$

Therefore $\widehat{\,\delta\,} = \min\{x_1,\ldots,x_n\}.$

This is an example of what calculus textbooks call an endpoint maximum. Those don't generally occur where the derivative is $0.$
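For a textbook analogue of an endpoint maximum (my example, not the answerer's): $g(t)=t$ on $[0,1]$ attains its maximum at the endpoint $t=1$, even though $g'(t)=1\neq 0$ there; the log-likelihood behaves the same way in $\delta$ on $\bigl(-\infty,\min\{x_1,\ldots,x_n\}\bigr].$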

Answer

Hint: You can substitute $y=x-\delta$ (for fixed $\delta$) to get a more familiar expression: $$f(y)=\frac{1}{\theta}e^{-y/\theta},\quad y>0$$

Thus the likelihood function is

$$L(\theta)=\prod_{i=1}^n \frac{1}{\theta} e^{-y_i/\theta},$$

and the log-likelihood is

$$\ell(\theta)=-n\ln\theta - \frac{1}{\theta}\sum_{i=1}^n y_i.$$
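Finishing the hint (this final step is mine, following the accepted answer): with $\hat\delta=\min_i x_i$ and $y_i=x_i-\hat\delta$, setting the derivative to zero gives

$$\frac{d\ell}{d\theta}=-\frac{n}{\theta}+\frac{1}{\theta^2}\sum_{i=1}^n y_i=0 \quad\Longrightarrow\quad \hat\theta=\bar y=\bar x-\min_i x_i.$$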