ML & MM Estimators Problem


Suppose $X_1,\ldots, X_n$ are $\text{iid}$ random variables each with probability density function $f(x) = θx^{θ−1}$ where $0 < x < 1$, $θ > 0.$

a) Show that $\ln L(θ) = n \ln θ + (θ − 1)\sum \ln x_i$, and hence find the maximum likelihood estimator for $θ$.

b) Show that $E(X) = \dfrac \theta {\theta + 1}$ , and hence find the method of moments estimator for θ.

c) Are the ML and MM estimators the same?


I know how to get $E(X)$. How do I tackle a), and the method of moments estimator for $θ$ in b)? Thanks!


Last EDIT: Solved.

There is 1 best solution below.

$$\operatorname{E}(X) = \dfrac\theta{\theta+1}$$

$$(\theta+1)\operatorname{E}(X) = \theta,$$

$$\theta\operatorname{E}(X) + \operatorname{E}(X) = \theta$$

$$\theta\operatorname{E}(X) - \theta = -\operatorname{E}(X)$$

$$\theta (\operatorname{E}(X)-1) = -\operatorname{E}(X)$$

$$ \theta = \frac{\operatorname{E}(X)}{1 - \operatorname{E}(X)}$$

If you had trouble with that part, it seems to be more a matter of algebra than anything else. To get the method-of-moments estimator from that, just put the sample average in place of the population average $\operatorname{E}(X).$
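As a minimal sketch of that substitution (the function name and the simulated sample are my own illustration, not part of the question), the method-of-moments estimate is just $\bar x/(1-\bar x)$:

```python
import random

def mom_estimate(xs):
    """Method-of-moments estimate: replace E(X) with the sample mean
    x_bar in theta = E(X) / (1 - E(X))."""
    xbar = sum(xs) / len(xs)
    return xbar / (1 - xbar)

# Illustrative check: since F(x) = x^theta on (0, 1), inverse-CDF
# sampling gives X = U^(1/theta) with U ~ Uniform(0, 1).
random.seed(0)
theta_true = 2.0
sample = [random.random() ** (1 / theta_true) for _ in range(100_000)]
print(mom_estimate(sample))  # should land near theta_true = 2.0
```

With a large sample the estimate should sit close to the true $\theta$, since $\bar x \to \operatorname{E}(X)$.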

It's not clear what your question is about part (a). Are you wondering how they concluded $L(\theta)$ is what it is? In fact, the function you've shown is not what is usually denoted $L(\theta),$ but rather you've got the function usually denoted as $\ell(\theta),$ which is $\ln L(\theta).$ Or are you wondering how to get from that function to the MLE?
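In case the sticking point is the last step of part (a): differentiating $\ell(\theta) = n\ln\theta + (\theta-1)\sum \ln x_i$ gives $\ell'(\theta) = n/\theta + \sum \ln x_i$, and setting this to zero yields $\hat\theta = -n/\sum \ln x_i$. A quick sketch (again with a simulated sample of my own, not from the question):

```python
import math
import random

def mle_estimate(xs):
    """MLE for f(x) = theta * x^(theta-1) on (0, 1):
    l'(theta) = n/theta + sum(ln x_i) = 0
    gives theta_hat = -n / sum(ln x_i)."""
    return -len(xs) / sum(math.log(x) for x in xs)

# Illustrative check via inverse-CDF sampling: X = U^(1/theta).
random.seed(1)
theta_true = 2.0
sample = [random.random() ** (1 / theta_true) for _ in range(100_000)]
print(mle_estimate(sample))  # should land near theta_true = 2.0
```

Note $\hat\theta > 0$ automatically, since each $\ln x_i < 0$ when $0 < x_i < 1$.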