I am trying to understand the maximum likelihood estimation on an example.
Given random variables $X, Y$ that are independent and uniformly distributed on $[0,\theta]$, find the MLE of $\theta$.
What I know:
For $Z=(X,Y)$ the likelihood function is $L(\theta)=f(\mathbf x;\theta)$. And the MLE is defined as $\Theta=\operatorname {arg\,max}_{\theta \in \Theta}L(\theta)$
The solution only states that $\hat\theta=\max(x,y)$. I hope someone can explain to me how to approach this problem. Thanks in advance.
The likelihood is $L(\theta)=\frac{1}{\theta^2}$ for $\theta\geq\max(x,y)$ (and $0$ otherwise), so it is strictly decreasing in $\theta$ on its support.
But you know that
$$0\leq x\leq \theta$$
$$0\leq y\leq \theta$$
That is
$$\theta\geq \max(x,y)$$
So, since the likelihood is strictly decreasing, its argmax is attained at the boundary of the admissible region: $\hat\theta=\max(x,y)$.
Ps: strictly speaking, the MLE maximizes a supremum that need not be attained. Try the same exercise with support $(0,\theta)$: there you need $\theta>\max(x,y)$ strictly, so the likelihood approaches but never reaches its supremum, and the argmax does not exist.
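As a sanity check, here is a minimal numerical sketch (assuming NumPy; the grid size and seed are arbitrary choices) that evaluates the likelihood $L(\theta)=\theta^{-2}\,\mathbf 1\{\theta\geq\max(x,y)\}$ on a grid and confirms the argmax sits at $\max(x,y)$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 3.0
x, y = rng.uniform(0, theta_true, size=2)

# Likelihood of theta given one observation of (X, Y):
# 1/theta^2 if theta >= max(x, y), and 0 otherwise.
def likelihood(theta, x, y):
    return np.where(theta >= max(x, y), 1.0 / theta**2, 0.0)

# Evaluate on a fine grid and locate the maximizer numerically.
grid = np.linspace(0.01, 10.0, 100_000)
theta_hat = grid[np.argmax(likelihood(grid, x, y))]

print(max(x, y), theta_hat)  # theta_hat is (up to grid spacing) max(x, y)
```

Because the likelihood jumps from $0$ to its largest value exactly at $\theta=\max(x,y)$ and decreases afterwards, the numerical argmax lands on the first grid point at or above $\max(x,y)$.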