Likelihood Cramér-Rao Bound.

How can I show the following necessary and sufficient condition?

An unbiased estimator $ \hat{\theta} $ of $ \theta $ attains the Cramér–Rao lower bound if and only if $$ \frac{\partial \log L(\theta)}{\partial \theta} = I(\theta) \cdot (\hat{\theta} - \theta), $$ where $ I(\theta) $ and $ L(\theta) $ denote, respectively, the Fisher information and the likelihood function of a sample $ (X_{1},X_{2},\ldots,X_{n}) $ of i.i.d. random variables with a smooth pdf.
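As a sanity check on the identity (not part of the original question), the Gaussian-mean case can be verified symbolically. The sample size $n=4$ and unit variance are arbitrary choices; note the score comes out proportional to $\hat{\theta}-\theta$, which fixes the sign convention.

```python
import sympy as sp

n = 4
theta = sp.symbols('theta')
xs = sp.symbols('x1:5')  # symbols x1, x2, x3, x4

# Log-likelihood of an N(theta, 1) sample, dropping the additive constant
logL = sum(-sp.Rational(1, 2) * (x - theta)**2 for x in xs)

score = sp.diff(logL, theta)   # the score: d/dtheta log L(theta)
xbar = sum(xs) / n             # the unbiased estimator \hat{theta} = sample mean
info = sp.Integer(n)           # Fisher information I(theta) = n/sigma^2 = n here

# The score factors exactly as I(theta) * (\hat{theta} - theta)
assert sp.simplify(score - info * (xbar - theta)) == 0
```

Here $\operatorname{Var}[\bar{X}] = \sigma^2/n = 1/I(\theta)$, so the bound is attained, as the condition predicts.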

The ‘$ \Longrightarrow $’ implication is clear, but I don’t know how to prove the ‘$ \Longleftarrow $’ implication.

Thanks a lot!

There is 1 answer below.


If $\hat{\theta}$ attains the Fréchet–Cramér–Rao lower bound, it is an efficient estimator of a parametric function $h(\theta)$. Efficiency of a statistic $T$ means that $\operatorname{Var}[T]$ equals that bound, and this happens if and only if $$\frac{\partial \log L(\theta)}{\partial \theta} = l(\theta)\,(T - \mathbb{E}[T]),$$ i.e. the score is an exact linear function of $T$ — this is precisely the equality case of the Cauchy–Schwarz inequality used to prove the bound.

So it really is a two-way implication. Just remember that $l(\theta) = I(\theta)$ and $\mathbb{E}[T] = \theta$ when $\hat{\theta} = T$ is unbiased for $\theta$.
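To spell the '$\Longleftarrow$' direction out, here is a sketch under the usual regularity conditions (differentiation under the integral sign is allowed), taking variances of both sides of the assumed identity — note the sign of $\hat{\theta}-\theta$ is irrelevant for this step:

```latex
% The score is an affine function of \hat{\theta}, so its variance is
\operatorname{Var}\!\left[\frac{\partial \log L(\theta)}{\partial \theta}\right]
   = I(\theta)^{2}\,\operatorname{Var}[\hat{\theta}].
% Under regularity, the variance of the score equals the Fisher information:
I(\theta) = I(\theta)^{2}\,\operatorname{Var}[\hat{\theta}]
\quad\Longrightarrow\quad
\operatorname{Var}[\hat{\theta}] = \frac{1}{I(\theta)},
% so \hat{\theta} attains the Cram\'er--Rao lower bound.
```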