Question:
Let $X_1,...,X_n$ be a sample with distribution
$p_{\theta}(x)=\theta x^{-2}$, for $x \geq \theta$
and $0$ for $x < \theta$ with $\theta > 0$ unknown.
Determine the maximum likelihood estimator of $\theta$.
My attempt: take the derivative of the log-likelihood function and set it equal to $0$ in order to find a maximum. Doing this gives
$\frac{n}{\theta} = 0,$
which has no solution for $\theta$, so I have no idea where to go from here.
Answer:
We want to maximise the probability
$P(X_1,X_2,...,X_n|\theta)$
By the chain rule of probability,
$P(X_1,X_2,...,X_n|\theta)=P(X_1|\theta)P(X_2|X_1,\theta)\cdots P(X_n|X_1,X_2,...,X_{n-1},\theta)$
Since the observations are independent, this simplifies to
$P(X_1,X_2,...,X_n|\theta)=P(X_1|\theta)P(X_2|\theta)...P(X_n|\theta)$
Taking the logarithm, we get
$\sum_{i}\log P(X_i|\theta)$
Now, there is one thing to pay attention to. If $\theta$ is chosen to be greater than even one of the $X_i$, then the sum contains a term $\log P(X_i|\theta)=\log 0=-\infty$, according to the probability model you defined. Therefore $\theta$ must be at most the smallest $X_i$. In that case the sum can be written as
$\sum_{i}\log P(X_i|\theta)=\sum_{i}\log (\theta X_i^{-2})=n\log\theta+\sum_{i}\log X_i^{-2}$
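The behaviour of this log-likelihood can be checked numerically; the following is a minimal sketch (the helper names `log_density` and `log_likelihood` and the example data are my own, not from the question):

```python
import math

def log_density(x, theta):
    """log p_theta(x); -inf when x < theta, since the density is 0 there."""
    if x < theta:
        return float("-inf")
    return math.log(theta) - 2.0 * math.log(x)

def log_likelihood(xs, theta):
    """sum_i log p_theta(x_i)."""
    return sum(log_density(x, theta) for x in xs)

xs = [3.0, 5.0, 2.5, 8.0]   # example data; min(xs) = 2.5

# The closed form n*log(theta) + sum_i log(x_i**-2) agrees with the
# term-by-term sum, as long as theta <= min(xs):
theta = 1.7
closed = len(xs) * math.log(theta) + sum(math.log(x ** -2) for x in xs)
assert abs(closed - log_likelihood(xs, theta)) < 1e-9

# The log-likelihood is strictly increasing in theta on (0, min(xs)]:
assert log_likelihood(xs, 1.0) < log_likelihood(xs, 2.0) < log_likelihood(xs, 2.5)

# Any theta above even one of the x_i makes the log-likelihood -inf:
assert log_likelihood(xs, 3.0) == float("-inf")
```

The last two assertions illustrate the two facts the argument needs: below the sample minimum the log-likelihood keeps growing in $\theta$, and above it the likelihood collapses to zero.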
If there were no limit on $\theta$, the term $n\log\theta$ would grow without bound as $\theta\to\infty$, so the log-likelihood would have no maximum; indeed its derivative $\frac{n}{\theta}$ is strictly positive for every $\theta>0$, which is exactly why setting it equal to $0$ fails. But we established that $\theta$ must be at most the minimum of the $X_i$, $i=1,2,...,n$. Since the log-likelihood is increasing in $\theta$, it is maximised at the boundary of this constraint: $\hat{\theta}=\min_i X_i$.
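As a sanity check, here is a small simulation sketch. Since the CDF is $F(x)=1-\theta/x$ for $x\geq\theta$, inverse-transform sampling gives $X=\theta/U$ with $U\sim\mathrm{Uniform}(0,1)$; the grid search and helper names below are illustrative assumptions, not part of the question.

```python
import math
import random

def sample(theta, n, rng):
    """Inverse-CDF sampling: F(x) = 1 - theta/x, so X = theta / U."""
    return [theta / rng.random() for _ in range(n)]

def log_likelihood(xs, theta):
    if theta > min(xs):
        return float("-inf")      # some x_i falls below the support
    return len(xs) * math.log(theta) - 2.0 * sum(math.log(x) for x in xs)

rng = random.Random(0)
theta_true = 2.0
xs = sample(theta_true, 1000, rng)

# Grid search over candidate values of theta, including min(xs) itself:
grid = [0.01 * k for k in range(1, 500)] + [min(xs)]
theta_hat = max(grid, key=lambda t: log_likelihood(xs, t))

assert theta_hat == min(xs)                 # the maximiser is the sample minimum
assert abs(theta_hat - theta_true) < 0.05   # and it is close to the true theta
```

With $n=1000$ observations the sample minimum sits just above the true $\theta$, which also hints at a known property of this estimator: $\min_i X_i \geq \theta$ always, so the MLE is biased upward.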