MLE for uniform distribution with two parameters


Random variables $X_1, \dots, X_n$ are independent and uniformly distributed on $[a, b]$ with $b > a$, so that each has density $$f_X(x) = \frac{1}{b-a}, \quad a\le x \le b.$$ Find the MLE of $(a, b)$.

Here's what I have tried:

$L(b,a;x) = \prod_{i=1}^nf(x_i;b, a) = \prod_{i=1}^n\frac{1}{b-a} = \left(\frac{1}{b-a}\right)^n \\ \mathbf{L}(b,a;x) = \log L(b,a;x) = -n\log(b-a)$

Taking the first and second derivatives with respect to $b$, and then with respect to $a$:

$\frac{\partial\mathbf{L}(b,a;x)}{\partial b} = -\frac{n}{b-a} \\ \frac{\partial\mathbf{L}(b,a;x)}{\partial a} = \frac{n}{b-a} \\ \frac{\partial^2\mathbf{L}(b,a;x)}{\partial b^2} = \frac{n}{(b-a)^2} \\ \frac{\partial^2\mathbf{L}(b,a;x)}{\partial a^2} = \frac{n}{(b-a)^2} \\ \frac{\partial^2\mathbf{L}(b,a;x)}{\partial a\,\partial b} = -\frac{n}{(b-a)^2}$

Setting the first two derivatives to zero gives no solution: neither $-\frac{n}{b-a}$ nor $\frac{n}{b-a}$ vanishes for any $b > a$, so there are no turning points. Checking for a local maximum anyway, I plug the second derivatives into the Hessian matrix:

$$H :=\begin{pmatrix} \frac{\partial^2\mathbf{L}(b,a;x)}{\partial b^2}&\frac{\partial^2\mathbf{L}(b,a;x)}{\partial a\,\partial b} \\ \frac{\partial^2\mathbf{L}(b,a;x)}{\partial a\,\partial b}&\frac{\partial^2\mathbf{L}(b,a;x)}{\partial a^2} \end{pmatrix} = \begin{pmatrix} \frac{n}{(b-a)^2}&-\frac{n}{(b-a)^2} \\ -\frac{n}{(b-a)^2}&\frac{n}{(b-a)^2} \end{pmatrix} \\ \det(H) = \frac{n^2}{(b-a)^4} - \frac{n^2}{(b-a)^4} = 0$$

With no stationary point and $\det(H) = 0$, the second-derivative test is inconclusive: $\mathbf{L}$ is strictly increasing in $a$ and strictly decreasing in $b$, so the calculus approach finds no interior maximum at all.
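These partial derivatives can be checked symbolically with SymPy (my own sketch, not part of the original post; it ignores the support constraint, just as the derivation above does):

```python
import sympy as sp

a, b, n = sp.symbols('a b n', positive=True)
logL = -n * sp.log(b - a)  # log-likelihood, ignoring the support constraint

db = sp.diff(logL, b)       # equals -n/(b - a)
da = sp.diff(logL, a)       # equals  n/(b - a)
dbb = sp.diff(logL, b, 2)   # equals  n/(b - a)**2
daa = sp.diff(logL, a, 2)   # equals  n/(b - a)**2
dab = sp.diff(logL, a, b)   # equals -n/(b - a)**2

# The Hessian is singular: its determinant simplifies to 0
H = sp.Matrix([[dbb, dab], [dab, daa]])
print(sp.simplify(H.det()))
```

Since `db` and `da` are nonzero for every $b > a$ and the Hessian determinant is identically zero, the symbolic computation confirms that first- and second-order conditions alone cannot locate a maximizer here.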

However, this cannot be right. The actual question gives three observations, $x_1 = 1.5$, $x_2 = 4.6$, $x_3 = 7.2$, and asks me to plug them into the MLE. If no MLE exists, I cannot do that, so I must have gone wrong somewhere.

Best answer:

Given a sample $X_1, \dots, X_n$, the MLE of $(a,b)$ is $(X_{(1)},X_{(n)})$, where $X_{(k)}$ denotes the $k$-th order statistic. The key point is that the full likelihood carries the support constraint: $$L((\theta_1,\theta_2);X)=\frac{1}{(\theta_2-\theta_1)^n}\prod_{k\leq n}\mathbf{1}_{[\theta_1,\theta_2]}(X_k),$$ which is maximized by $(X_{(1)},X_{(n)})$. To see this, note that $(\theta_2-\theta_1)^{-n}$ grows as $\theta_2-\theta_1\to 0^+$, but the product of indicators sends $L$ to $0$ as soon as the interval $[\theta_1,\theta_2]$ excludes even one observation $X_k$. Hence $L$ is maximized by the shortest interval containing all observations. For the given data, $\hat a = x_{(1)} = 1.5$ and $\hat b = x_{(3)} = 7.2$.
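A quick numerical illustration of this argument (my own sketch, not from the original answer), using the three observations from the question:

```python
import numpy as np

x = np.array([1.5, 4.6, 7.2])  # the observations given in the question

def likelihood(a, b, x):
    """Uniform(a, b) likelihood, including the indicator on the support."""
    if a >= b or x.min() < a or x.max() > b:
        return 0.0  # some observation falls outside [a, b]
    return (b - a) ** (-len(x))

# A few candidate (a, b) pairs: intervals that exclude an observation score 0,
# and among those containing all points, the shortest interval wins.
candidates = [(0.0, 10.0), (1.0, 8.0), (1.5, 7.2), (2.0, 7.2), (1.5, 7.0)]
best = max(candidates, key=lambda ab: likelihood(*ab, x))
print(best)  # (1.5, 7.2), i.e. (min(x), max(x))
```

Shrinking the interval past either extreme observation, as with $(2.0, 7.2)$ or $(1.5, 7.0)$, kills the likelihood entirely, which is exactly why the maximizer is $(\min_k x_k, \max_k x_k)$.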