Maximum Likelihood Estimator of Uniform($-2 \theta, 5 \theta$)


Let $X = (X_1, \dots, X_n)$ be a random sample from the Uniform($-2 \theta, 5 \theta$) distribution with $\theta > 0$ unknown. Find the maximum likelihood estimator (MLE) for $\theta.$ Furthermore, determine whether the MLE $\hat{\theta}$ is a function of a one-dimensional sufficient statistic for $\theta.$

Let $M = \max{ \{X_1, \dots, X_n \}}$ and $L = \min{ \{X_1, \dots, X_n \}}.$ Consider the likelihood function of $\theta$ $$L(\theta; x) = \prod_{k=1}^{n} f(x_k ; \theta) = \prod_{k=1}^{n} \frac{1}{7 \theta} \cdot \mathbf{1}_{(-2 \theta, 5 \theta)}(x_k) = \frac{1}{(7 \theta)^n} \cdot \mathbf{1}_{(-2 \theta, 5 \theta)}(m) \cdot \mathbf{1}_{(-2 \theta, 5 \theta)}(\ell) \cdot \prod_{k=1}^{n} \mathbf{1}_{\mathbf{R}}(x_k).$$ By the Factorization Theorem, it follows that $(M, L)$ is sufficient for $\theta,$ and in fact, it is easy to show that $(M, L)$ is minimal sufficient for $\theta.$ Our candidates for the MLE include $M,$ $L,$ and functions of $M$ and $L,$ e.g., the half-range $\frac{M-L}{2};$ however, I am running into difficulty finding the MLE and establishing that it gives a maximum.
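The factorization can be checked numerically. Below is a minimal sketch (the function name `likelihood` and the sample values are illustrative, not from the post): two samples sharing the same minimum and maximum give identical likelihoods at every $\theta$, so the likelihood sees the data only through $(M, L)$.

```python
def likelihood(theta, xs):
    """Likelihood of theta for a Uniform(-2*theta, 5*theta) sample:
    (7*theta)^(-n) if every observation lies in (-2*theta, 5*theta), else 0."""
    n = len(xs)
    if all(-2 * theta < x < 5 * theta for x in xs):
        return (7 * theta) ** (-n)
    return 0.0

# Two samples with the same min (-1.0) and max (3.0) but different interior points
x1 = [-1.0, 0.5, 3.0]
x2 = [-1.0, 2.0, 3.0]

# The likelihoods agree at every theta, since they depend only on (M, L)
for theta in (0.5, 0.7, 1.0, 2.0):
    assert likelihood(theta, x1) == likelihood(theta, x2)
```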

At first glance, it appeared that $\hat{\theta} = L,$ since $m \geq \ell$ implies that $\frac{1}{(7m)^n} \leq \frac{1}{(7 \ell)^n};$ however, this holds only when $m \geq \ell > 0.$

Reading a few other posts on here, I considered the possibility that the half-range $\frac{M-L}{2}$ is the MLE for $\theta.$ Of course, the difficulty arises from the fact that there are many sign configurations for $L$ and $M$: $m \geq \ell > 0,$ $m \geq 0 > \ell,$ and $0 \geq m > \ell,$ to name a few.

Can anyone offer any helpful insight or tips?

Accepted answer:

The likelihood function equals $(7\theta)^{-n}$ when $-2\theta < L$ and $5\theta > M,$ and is zero otherwise. Since $(7\theta)^{-n}$ is decreasing in $\theta,$ the MLE is the smallest value of $\theta$ satisfying $\theta > -L/2$ and $\theta > M/5.$ Thus $\hat{\theta} = \max\{M/5, -L/2\}.$
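This closed form can be sanity-checked against a brute-force grid search over $\theta$. The sketch below assumes a concrete made-up sample, and includes the interval endpoints in the feasibility check so that $\hat{\theta}$ itself attains the maximum (with strict inequalities the supremum is approached but not attained at the boundary).

```python
def mle_uniform(xs):
    """Closed-form MLE for theta in Uniform(-2*theta, 5*theta):
    the smallest theta whose support covers the whole sample."""
    M, L = max(xs), min(xs)
    return max(M / 5, -L / 2)

def likelihood(theta, xs):
    # Endpoints included so the maximizer theta_hat itself is feasible
    n = len(xs)
    if all(-2 * theta <= x <= 5 * theta for x in xs):
        return (7 * theta) ** (-n)
    return 0.0

xs = [-3.0, 1.0, 4.0, 10.0]      # M = 10, L = -3
theta_hat = mle_uniform(xs)      # max(10/5, 3/2) = 2.0

# Grid search over theta: no candidate beats the closed-form estimate
grid = [0.01 * k for k in range(1, 1001)]
best = max(grid, key=lambda t: likelihood(t, xs))
assert likelihood(theta_hat, xs) >= likelihood(best, xs)
```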

Second answer:

The support of $L(\theta; x),$ as a function of $\theta,$ is given by $L \ge -2 \theta$ and $M \le 5\theta,$ or equivalently $\theta \ge M/5$ and $\theta \ge -L/2,$ i.e., $$\theta \ge T \triangleq \max(M/5,\,-L/2).$$

Because $L(\theta; x)$ is strictly decreasing in $\theta$ over its support, it is maximized at the left endpoint: $\theta_{ML} = T.$


Regarding whether $(M, L)$ is minimal sufficient:

You say "it is easy to show that $(M,L)$ is minimal sufficient for $\theta$," but I don't think that's true.

$$L(\theta; x)=\frac{1}{(7\theta)^n}\mathbf{1}_{[M \le 5 \theta]} \mathbf{1}_{[L \ge -2 \theta]}=\frac{1}{(7\theta)^n} \mathbf{1}_{[T\le\theta]} $$ tells us that both $(M,L)$ and $T$ are sufficient. $T$ is clearly minimal. Since $T$ is a function of $(M,L)$ but $(M,L)$ is not a function of $T,$ $(M,L)$ cannot be minimal: a minimal sufficient statistic must be a function of every other sufficient statistic, including $T.$

Put another way, consider some $x_1$ with $(M_1,L_1)=(100,-2)$ and some $x_2$ with $(M_2,L_2)=(100,-4),$ so that $T_1=T_2=20.$

Then $\frac{L(\theta; x_1)}{L(\theta; x_2)}=1$ does not depend on $\theta,$ yet $(M_1,L_1)\ne (M_2,L_2);$ hence $(M,L)$ is not minimal.
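The counterexample can be verified directly. A small sketch (writing the likelihood in terms of the sufficient statistic $(M, L)$ with the closed indicator brackets of the displayed formula, and picking $n = 5$ arbitrarily):

```python
def likelihood(theta, M, L, n):
    """Likelihood in terms of the sufficient statistic (M, L):
    (7*theta)^(-n) * 1[M <= 5*theta] * 1[L >= -2*theta]."""
    if M <= 5 * theta and L >= -2 * theta:
        return (7 * theta) ** (-n)
    return 0.0

n = 5
# Different (M, L) pairs with the same T = max(M/5, -L/2) = 20
M1, L1 = 100, -2
M2, L2 = 100, -4

# On the common support (theta >= 20) the likelihood ratio is constant in theta
for theta in (20, 25, 50, 100):
    r = likelihood(theta, M1, L1, n) / likelihood(theta, M2, L2, n)
    assert r == 1.0
```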