How to find the Cramér-Rao bound


I am trying to work with this density: $$f(x\mid\theta)=\theta x^{\theta-1}, \;\;0<x<1,\,\,0<\theta<\infty.$$ I have already found the MLE.

$$L(x,\theta) = \prod_{i=1}^n \theta x_i^{\theta-1} = \theta^n \prod_{i=1}^n x_i^{\theta-1}$$ $$\log L(x,\theta) = n\log(\theta) + (\theta-1)\sum_{i=1}^n \log(x_i)$$ Setting the derivative with respect to $\theta$ equal to zero, $$\dfrac{n}{\theta} + \sum_{i=1}^n\log(x_i) = 0,$$ so that $$\hat{\theta} = \dfrac{-n}{\sum_{i=1}^n \log(x_i)}.$$ Could someone guide me on how to find the Cramér-Rao bound?
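As a quick sanity check of the closed-form MLE above, one can simulate from the model: $f(x\mid\theta)=\theta x^{\theta-1}$ on $(0,1)$ is the Beta$(\theta,1)$ density. A minimal sketch (the seed, sample size, and true $\theta$ are my own choices, not from the question):

```python
import numpy as np

# f(x|theta) = theta * x^(theta-1) on (0,1) is the Beta(theta, 1) density,
# so we can draw a sample with numpy's beta generator.
rng = np.random.default_rng(0)
theta_true = 3.0
n = 10_000
x = rng.beta(theta_true, 1.0, size=n)

# Closed-form MLE derived above: theta_hat = -n / sum(log x_i)
theta_hat = -n / np.log(x).sum()
print(theta_hat)  # should land close to theta_true = 3.0
```

With $n = 10{,}000$ the estimate should sit within a few hundredths of the true value, consistent with the asymptotic standard deviation $\theta/\sqrt{n} = 0.03$.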


The variance of any unbiased estimator of $\theta$ is at least $I(\theta)^{-1}$, where $I$ is the Fisher information (this is the Cramér-Rao bound; it is also the asymptotic variance of the MLE $\hat{\theta}$). Here, writing $\ell=\log L$ for the log-likelihood, $$ \ell'(\theta)=\frac{n}{\theta}+\sum_{i=1}^n\log(x_i), \qquad \ell''(\theta)=-\frac{n}{\theta^2}, $$ so $$ I(\theta)=-E(\ell''(\theta))=\frac{n}{\theta^2}. $$ Hence the Cramér-Rao bound is $\theta^2/n$.
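One can check the bound $\theta^2/n$ by Monte Carlo. A sketch under assumed settings ($\theta$, $n$, replication count, and seed are my choices): note the MLE here is only asymptotically unbiased, so its variance sits slightly above the bound at finite $n$ and approaches it as $n$ grows.

```python
import numpy as np

# Monte Carlo comparison of the empirical variance of the MLE
# theta_hat = -n / sum(log x_i) against the Cramer-Rao bound theta^2 / n.
rng = np.random.default_rng(42)
theta, n, reps = 2.0, 200, 20_000

# Each row is one simulated dataset of size n from Beta(theta, 1),
# whose density is f(x|theta) = theta * x^(theta-1).
samples = rng.beta(theta, 1.0, size=(reps, n))
theta_hats = -n / np.log(samples).sum(axis=1)

crb = theta**2 / n  # Cramer-Rao bound: 4 / 200 = 0.02
print(theta_hats.var(), crb)
```

The empirical variance should come out near $0.0204$ (the exact finite-sample variance $n^2\theta^2/((n-1)^2(n-2))$), just above the bound $0.02$, illustrating that the bound is attained only asymptotically.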