Bain & Engelhardt Ex. 9.6: I wanted some advice/confirmation on whether my answer is correct.


Q: Find the MLE based on a random sample $X_1, \ldots, X_n$ from the pdf $$f(x;\theta_1,\theta_2)=\frac{1}{\theta_2 - \theta_1}, \quad \theta_1\le x \le \theta_2,$$ and zero otherwise.

A: \begin{align} \log f(x) = -\log(\theta_2-\theta_1) \end{align} $$\frac{\partial \log f}{\partial \theta_1}=\frac{1}{\theta_2-\theta_1},$$ which tells us that the log-likelihood function is increasing in $\theta_1$, since the partial derivative is $\ge 0$; $$\frac{\partial \log f}{\partial \theta_2}=\frac{1}{\theta_1-\theta_2},$$ which tells us that the log-likelihood function is decreasing in $\theta_2$, since the partial derivative is $\le 0$.

Since we want to maximize the likelihood, I think the answer should be $$\hat{\theta}_1 = X_{1:n}$$ (the sample minimum), since we need $\theta_1 \le x_i$ for all $i$, and $$\hat{\theta}_2 = X_{n:n}$$ (the sample maximum), since we need $x_i \le \theta_2$ for all $i$.
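The support constraint that this boundary argument leans on can be made explicit by writing the likelihood with indicator functions (this is just the reasoning above written out in one line):

$$L(\theta_1,\theta_2)=\prod_{i=1}^n \frac{1}{\theta_2-\theta_1}\,\mathbf{1}\{\theta_1\le x_i\le\theta_2\} = (\theta_2-\theta_1)^{-n}\,\mathbf{1}\{\theta_1 \le x_{1:n}\}\,\mathbf{1}\{x_{n:n}\le\theta_2\},$$

so any $\theta_1 > x_{1:n}$ or $\theta_2 < x_{n:n}$ gives $L = 0$, and within the feasible region the monotonicity pushes both estimates to the boundary.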


Welcome to MSE. The formal way is to say that

$$L(\theta_1,\theta_2) = (\theta_2-\theta_1)^{-n} \quad \text{for } \theta_1 \le \min_i x_i \text{ and } \max_i x_i \le \theta_2,$$

and $L = 0$ otherwise.

Then, as you noticed, $L$ is increasing in $\theta_1$ and decreasing in $\theta_2$, so subject to $\theta_1 \le \min_i x_i \le \max_i x_i \le \theta_2$ the likelihood is maximized at the boundary:

$$\hat{\theta_2} = \max_i x_i, ~\hat{\theta_1} = \min_i x_i. $$
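As a quick numerical sanity check (not part of the textbook exercise, and using made-up true parameters $\theta_1 = 2$, $\theta_2 = 5$), one can verify that the likelihood is zero once the interval excludes an observation, and only decreases as the interval widens beyond $[\min_i x_i, \max_i x_i]$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta1, theta2 = 2.0, 5.0                      # hypothetical true parameters
x = rng.uniform(theta1, theta2, size=50)       # simulated sample

def likelihood(t1, t2, x):
    # L = (t2 - t1)^(-n) when all observations lie in [t1, t2], else 0.
    if t1 <= x.min() and x.max() <= t2:
        return (t2 - t1) ** (-len(x))
    return 0.0

mle = likelihood(x.min(), x.max(), x)          # candidate MLE: [min, max]

# Widening the interval in either direction lowers the likelihood ...
assert likelihood(x.min() - 0.1, x.max(), x) < mle
assert likelihood(x.min(), x.max() + 0.1, x) < mle
# ... and shrinking it past an observation drops the likelihood to zero.
assert likelihood(x.min() + 0.1, x.max(), x) == 0.0
```

This matches the boundary argument: among all intervals containing every observation, $[\,x_{1:n},\,x_{n:n}\,]$ is the shortest, hence gives the largest value of $(\theta_2-\theta_1)^{-n}$.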

Hope it helps.