I'm solving the following exercise from a previous final exam:
For $\alpha \in (0,1)$, we define the following density function $$f (x; \alpha) = \alpha x^{\alpha -1} \mathbf{1}_{[0,1]} (x)$$
We consider a sample $(X_1, \ldots, X_n)$ of i.i.d. random variables with density $f$. Prove that the maximum likelihood estimator $\hat\alpha_n$ of $\alpha$ converges to $\alpha$ almost surely as $n \to \infty$.
After writing down the log-likelihood function, I use the first- and second-order conditions (FOC and SOC) to get $$\hat\alpha_n = \frac{-n}{\ln \left(\prod_{i=1}^n X_i\right)} = \frac{-n}{\sum_{i=1}^n \ln X_i}$$
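In detail, the log-likelihood and its first two derivatives are $$\ell_n(\alpha) = n \ln \alpha + (\alpha - 1) \sum_{i=1}^n \ln X_i, \qquad \ell_n'(\alpha) = \frac{n}{\alpha} + \sum_{i=1}^n \ln X_i, \qquad \ell_n''(\alpha) = -\frac{n}{\alpha^2} < 0,$$ so setting $\ell_n'(\alpha) = 0$ gives the expression above, and the negative second derivative confirms it is a maximum.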
Then I'm stuck on proving that $\hat\alpha_n$ converges to $\alpha$ almost surely. How can I proceed to finish the proof? Many thanks!
First compute $\mathbb E [\ln X_1 ] = \int_0^1 \ln(x)\, \alpha x^{\alpha-1}\,dx = -1/\alpha$ (integration by parts; alternatively, note that $-\ln X_1 \sim \mathrm{Exp}(\alpha)$). By the SLLN, $\frac{1}{n}\sum_{i=1}^n\ln X_i$ converges almost surely to $-1/\alpha$. Since $\hat\alpha_n = -\left(\frac{1}{n}\sum_{i=1}^n\ln X_i\right)^{-1}$ and $x \mapsto -1/x$ is continuous at $-1/\alpha \neq 0$, the continuous mapping theorem gives $\hat\alpha_n \to \alpha$ almost surely.
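As a quick numerical sanity check (not part of the proof), you can simulate from $f(\cdot;\alpha)$ by inverse-CDF sampling: the CDF is $F(x) = x^\alpha$ on $[0,1]$, so $X = U^{1/\alpha}$ with $U$ uniform. The value $\alpha = 0.7$ below is an arbitrary choice for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.7        # arbitrary true parameter in (0, 1)
n = 200_000

# Inverse-CDF sampling: F(x) = x^alpha, so X = U^(1/alpha)
u = rng.random(n)
x = u ** (1 / alpha)

# MLE along the growing sample: alpha_hat_k = -k / sum_{i<=k} ln X_i
alpha_hat = -np.arange(1, n + 1) / np.cumsum(np.log(x))

print(alpha_hat[-1])  # settles near alpha as n grows
```

Plotting `alpha_hat` against the sample size makes the almost-sure convergence visible: the trajectory stabilizes around the true $\alpha$.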