Maximum Likelihood Estimator for $Beta(\frac{1}{\theta},1)$


I have a random sample $X_1, \dots, X_n$ from the $Beta(\frac{1}{\theta},1)$ distribution with density $$f(x; \theta) = \frac{1}{\theta}x^{\frac{1-\theta}{\theta}}, \quad 0<x<1, \; \theta>0.$$

I have to find the estimator $\bar{\theta}$ using the maximum likelihood method, so

$L(\theta) = \prod_{i=1}^n \frac{1}{\theta}x_i^{\frac{1-\theta}{\theta}}$

$\implies l := \log(L(\theta)) = \log\left(\prod_{i=1}^n \frac{1}{\theta}x_i^{\frac{1-\theta}{\theta}}\right) = -n\log(\theta) + \sum_{i = 1}^n\frac{1-\theta}{\theta}\log(x_i)$

$\implies \frac{dl}{d\theta}= -\frac{n}{\theta} + \sum_{i=1}^n\frac{-\log(x_i)}{\theta^2}$

$\frac{dl}{d\theta} = 0 \iff \frac{n}{\theta} = \frac{-1}{\theta^2}\sum_{i=1}^n\log(x_i) \iff \theta n = -\sum_{i=1}^n\log(x_i)$

$\implies \bar{\theta} = \frac{-\sum_{i=1}^n\log(x_i)}{n}$
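As a quick sanity check of the derived estimator, the following sketch (my own illustrative code, with an arbitrarily chosen true $\theta = 2$ and sample size) simulates a large sample from $Beta(\frac{1}{\theta},1)$ and evaluates $\bar{\theta} = -\frac{1}{n}\sum_i \log(x_i)$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0        # true parameter (illustrative choice)
n = 100_000        # large sample so the estimate is close to theta

# Draw from Beta(1/theta, 1); its density is (1/theta) * x**((1-theta)/theta)
x = rng.beta(1.0 / theta, 1.0, size=n)

# The MLE derived above: minus the sample mean of log(x_i)
theta_hat = -np.log(x).mean()
print(theta_hat)   # should be close to theta = 2.0
```

This works because for $X \sim Beta(\frac{1}{\theta},1)$ the variable $-\log X$ is exponential with mean $\theta$, so the sample mean of $-\log(x_i)$ concentrates around $\theta$.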

Up to this point I think everything is alright; now I have to check that $\bar{\theta}$ is a maximum using the second derivative, but I don't really see how.


Start with $$\frac{\partial l}{\partial \theta}= -\frac{n}{\theta} + \sum_{i=1}^n\frac{-\log(x_i)}{\theta^2}.$$ Differentiate again: $$\frac{\partial^2 l}{\partial \theta^2}= \frac{n}{\theta^2} + \sum_{i=1}^n\frac{2\log(x_i)}{\theta^3}.$$ Now substitute $\overline\theta=\frac{-\sum_{i=1}^n \log(x_i)}{n}$ for $\theta$; since $\sum_{i=1}^n \log(x_i) = -n\overline\theta$, this gives $$\frac{\partial^2 l}{\partial \theta^2}\Bigg|_{\theta=\overline\theta}= \frac{n}{\overline\theta^2} - \frac{2n}{\overline\theta^2} = -\frac{n}{\overline\theta^2} = -\frac{n^3}{\left(\sum_{i=1}^n\log(x_i)\right)^2}<0.$$ Since the second derivative at the critical point $\overline\theta$ is negative, $\overline\theta$ is a maximum.
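The whole derivation can also be verified symbolically. In the sketch below I introduce the shorthand $S := -\sum_{i=1}^n \log(x_i)$ (my own notation, positive because $0 < x_i < 1$), so the log-likelihood becomes $l(\theta) = -n\log\theta - S/\theta + S$:

```python
import sympy as sp

theta, n, S = sp.symbols('theta n S', positive=True)  # S := -sum(log x_i) > 0

# Log-likelihood with sum(log x_i) replaced by -S
l = -n * sp.log(theta) + (1 - theta) / theta * (-S)

# First derivative and its root: the MLE
l1 = sp.diff(l, theta)
theta_hat = sp.solve(sp.Eq(l1, 0), theta)[0]          # expect S/n

# Second derivative evaluated at the MLE
l2 = sp.diff(l, theta, 2)
l2_at_hat = sp.simplify(l2.subs(theta, theta_hat))    # expect -n**3/S**2, i.e. negative

print(theta_hat, l2_at_hat)
```

The negative second derivative at $\theta = S/n$ confirms the critical point is a maximum, matching the argument above.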