Determine the maximum likelihood estimator without knowing the observed value


Consider the following distribution with Lebesgue density $$f_{\theta}(x) = \begin{cases} \theta x^{-\theta-1}, \quad \text{if } 1 < x < \infty \\ 0, \quad \text{else} \end{cases},$$ for $\theta \in [a,b]$ with $1 < a < b < \infty$. Let $r > 1$ be a real number and the observed value $x_1$ be greater than $r$. However, the value of $x_1$ is unknown. Determine the MLE for $\theta$.

Since we don't know the exact value of $x_1$, the classic approach cannot be applied directly, so I computed the probability that a random variable with the distribution above exceeds $r$, which is $r^{-\theta}$. If we take this as the likelihood function, then the MLE for $\theta$ is the minimum of the parameter interval, i.e. $a$, since $r^{-\theta}$ is decreasing in $\theta$.
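The censored-likelihood computation above can be written out explicitly; a short derivation, using only the density given in the question:

$$L(\theta) = P_\theta(X_1 > r) = \int_r^\infty \theta x^{-\theta-1}\,dx = \left[-x^{-\theta}\right]_r^\infty = r^{-\theta},$$

and since $\frac{d}{d\theta}\log L(\theta) = -\log r < 0$ for $r > 1$, this likelihood is strictly decreasing on $[a,b]$, so its maximizer is $\hat{\theta} = a$.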

Is this correct or is there another approach which has to be considered?


Your log-likelihood is (up to an additive constant)

$$l(\theta)=\log\theta-\theta\log x_1$$

with MLE

$$\hat{\theta}=\frac{1}{\log x_1}$$

but since $\theta\in[a,b]$, the MLE becomes

$$\hat{\theta}=\max\left\{a;\min\left\{\frac{1}{\log x_1};b\right\}\right\}$$

Since $x_1$ is unknown but known to satisfy $x_1>r$, plugging in the boundary value $r$ gives the estimate

$$\hat{\theta}=\max\left\{a;\min\left\{\frac{1}{\log r};b\right\}\right\}$$
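As a sanity check, here is a small numerical sketch (with illustrative values $a=2$, $b=5$, $r=e^{1/4}$, all chosen here for illustration, not taken from the post) verifying by grid search that the clipped value $\max\{a,\min\{1/\log r,\,b\}\}$ does maximize $\log\theta-\theta\log r$ on $[a,b]$:

```python
import math

def log_lik(theta, x1):
    # log-likelihood l(theta) = log(theta) - theta*log(x1) for the Pareto-type density
    return math.log(theta) - theta * math.log(x1)

def mle_clipped(x1, a, b):
    # unconstrained stationary point 1/log(x1), clipped to the parameter interval [a, b]
    return max(a, min(1.0 / math.log(x1), b))

# illustrative values (assumed, not from the original post)
a, b = 2.0, 5.0
r = math.exp(0.25)  # chosen so that 1/log(r) = 4 lies inside [a, b]

theta_hat = mle_clipped(r, a, b)

# brute-force grid check that theta_hat maximizes the log-likelihood on [a, b]
grid = [a + i * (b - a) / 10000 for i in range(10001)]
theta_grid = max(grid, key=lambda t: log_lik(t, r))

print(theta_hat, theta_grid)
```

Because the log-likelihood is strictly concave in $\theta$ (its second derivative is $-1/\theta^2 < 0$), clipping the unconstrained stationary point to $[a,b]$ is indeed the constrained maximizer, which is what the grid search confirms.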