Likelihood ratio test: $X$ has the density function $g(x)=\frac{1}{\theta}e^{-\frac{x}{\theta}};\ x>0,\ 0<\theta\leq 1$. Find a test for the hypothesis $H_0\,(\theta=1)$ against $H_1\,(0<\theta<1)$.
Likelihood function: $L(\theta,x_1,x_2,\ldots)$
I have that:
$\Lambda_0 \subset\Lambda; H_0(\theta\in\Lambda_0),H_1(\theta \in \Lambda \setminus \Lambda_0).$
$\theta^*$ - the value of the parameter $\theta$ at which $L(\theta,x_1,x_2,\ldots)$ attains its maximum over $\theta\in\Lambda_0$.
$\theta^{**}$ - the value of the parameter $\theta$ at which $L(\theta,x_1,x_2,\ldots)$ attains its maximum over $\theta\in\Lambda$.
Now, using the method of maximum likelihood, I get the usual estimator $\overline{X}_n$. Shouldn't this be $\theta^{**}$?
But here it says that $\theta^{**}=\min\{\overline{X}_n,1\}$. I do not understand this at all. If someone could clarify why this is, it would mean a lot, because this keeps coming up. I do understand why $\theta^*=1$ in this case (because $H_0$ is a simple hypothesis).
The likelihood is $$ L(\theta) = \frac 1 {\theta^n} \exp\left( -\frac{x_1+\cdots+x_n} \theta \right) \quad\text{for } 0<\theta\le 1. $$ Its logarithm is $$ \ell(\theta) = -n\log\theta - \frac{x_1+\cdots+x_n} \theta \quad\text{for } 0<\theta\le 1. $$ Differentiating, we get $$ \ell\,'(\theta) = \frac{-n} \theta + \frac{x_1+\cdots+x_n}{\theta^2} = \frac{(x_1 + \cdots + x_n) - n\theta}{\theta^2} = \frac{\overline x -\theta}{\theta^2/n}. $$ If $\,\overline x >1$ then $\ell\,'(\theta)>0$ for all $\theta\in(0,1]$, so $L$ is increasing on the whole parameter interval. Consequently the value of $\theta\in(0,1]$ that maximizes $L(\theta)$ is $\theta=1$. If $\,\overline x =1$ then one can reach the same conclusion despite the fact that $\ell\,'(1) \not > 0$ (since $\ell\,'(\theta)>0$ for $0<\theta<1$).
If $\,0<\overline x <1$ then $\ell\,'(\theta) >0$ for $\theta<\overline x$ and $\ell\,'(\theta)<0$ for $\overline x < \theta< 1$, and therefore the value of $\theta$ that maximizes $L(\theta)$ is $\overline x$.
Bottom line: The value of $\theta$ that maximizes $L(\theta)$ is $\min\{\,\overline x, 1\,\}$.
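To make this concrete, here is a small numerical sketch (not part of the original answer) that grid-searches $\ell(\theta)$ over $(0,1]$ and compares the argmax with the closed form $\min\{\overline x, 1\}$. The function names `log_likelihood` and `mle_theta` are illustrative choices, and the second simulated sample deliberately uses a scale larger than 1 so that $\overline x > 1$ occurs and the boundary case $\theta^{**}=1$ is exercised.

```python
import numpy as np

def log_likelihood(theta, x):
    """l(theta) = -n*log(theta) - sum(x)/theta, valid for 0 < theta <= 1."""
    return -len(x) * np.log(theta) - np.sum(x) / theta

def mle_theta(x):
    """Closed-form restricted MLE from the answer: min(x-bar, 1)."""
    return min(np.mean(x), 1.0)

# Grid over (0, 1]; fine enough that the grid argmax should match
# the closed form to about 1e-5.
thetas = np.linspace(1e-3, 1.0, 100_000)

rng = np.random.default_rng(0)
for scale in (0.3, 1.5):  # scale=1.5 makes x-bar > 1 very likely
    x = rng.exponential(scale=scale, size=50)
    grid_argmax = thetas[np.argmax(log_likelihood(thetas, x))]
    print(f"x-bar = {np.mean(x):.3f}, grid argmax = {grid_argmax:.3f}, "
          f"min(x-bar, 1) = {mle_theta(x):.3f}")
```

In the first case the maximizer lands at $\overline x$ (an interior critical point); in the second it lands on the boundary $\theta = 1$, exactly as the sign analysis of $\ell\,'$ predicts.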