How to apply the Cramér-Rao inequality?


I am working with the density $$f(x\mid\theta)=\theta x^{\theta-1}, \;\;0<x<1,\,\,0<\theta<\infty.$$ I have finished finding my MLE.

$$L(x,\theta) = \prod_{i=1}^n \theta x_i^{\theta-1} = \theta^n \prod_{i=1}^n x_i^{\theta-1}$$ $$\log L(x,\theta) = n\log(\theta) + (\theta-1)\sum_{i=1}^n \log(x_i)$$ Setting the derivative with respect to $\theta$ to zero gives $$\dfrac{\partial}{\partial\theta}\log L(x,\theta) = \dfrac{n}{\theta} + \sum_{i=1}^n\log(x_i) = 0.$$

So that $$\hat{\theta} = \dfrac{-n}{\sum_{i=1}^n \log(x_i)}.$$
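To sanity-check this estimator numerically, one can sample from the density by inverse-CDF sampling (the CDF is $F(x)=x^\theta$ on $(0,1)$, so $X = U^{1/\theta}$ for uniform $U$) and plug the draws into $\hat\theta$. A minimal sketch, with variable names of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.5
n = 100_000

# Inverse-CDF sampling: F(x) = x**theta on (0, 1), so X = U**(1/theta)
x = rng.uniform(size=n) ** (1.0 / theta_true)

# MLE derived above: theta_hat = -n / sum(log x_i)
theta_hat = -n / np.log(x).sum()
print(theta_hat)  # close to theta_true for large n
```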

Please show me how to find the Cramér-Rao lower bound.

Best answer:

Your likelihood function is $$ L(x,\theta) = \prod_{i=1}^N \theta x_i^{\theta-1} = \theta^N \prod_{i=1}^N x_i^{\theta-1} $$ and the log likelihood is $$ \log L(x,\theta) = N\log(\theta) + (\theta-1)\sum_{i=1}^N \log(x_i), $$ and your MLE solves the first-order condition for the log likelihood, $$ \dfrac{N}{\theta} + \sum_{i=1}^N\log(x_i) = 0, $$ or $$ \hat{\theta} = \dfrac{-N}{\sum_{i=1}^N \log(x_i)}, $$ which agrees with what you found.

Now, since $D_\theta^2 \log L(x,\theta) = -N/\theta^2$ does not depend on $x$, the Fisher information is $$ I(\theta_0) = -\mathbb{E}_x[D_\theta^2 \log L(x,\theta_0)] = -(-N/\theta_0^2) = N/\theta_0^2. $$
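The per-observation information can be verified symbolically: differentiate the log-density twice in $\theta$ and average against the density, which should give $1/\theta^2$ per observation (so $N/\theta^2$ in total). A small SymPy sketch of my own:

```python
import sympy as sp

x, theta = sp.symbols('x theta', positive=True)

# log f(x | theta) for a single observation
logf = sp.log(theta) + (theta - 1) * sp.log(x)
d2 = sp.diff(logf, theta, 2)  # equals -1/theta**2, constant in x

# Per-observation Fisher information: I_1 = -E[d2], averaging over f(x|theta)
I1 = -sp.integrate(d2 * theta * x**(theta - 1), (x, 0, 1))
print(sp.simplify(I1))  # simplifies to 1/theta**2
```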

The Cramér-Rao inequality states that for any unbiased estimator, $$ \operatorname{var}(\hat{\theta}) \ge \dfrac{1}{I(\theta_0)} = \dfrac{\theta_0^2}{N}, $$ so the variance of your estimator is bounded below by the inverse of the Fisher information. (Strictly, the bound as stated applies to unbiased estimators; your MLE has a small finite-sample bias but attains this bound asymptotically.)