Cramér-Rao lower bound for normal($\theta, 4\theta^2$)


I am trying to find the Cramér-Rao lower bound (CRLB) for unbiased estimators of $\theta$, given a sample $X_1,\ldots, X_n \sim \textrm{normal}(\theta,4\theta^2)$. I am calculating the CRLB as

$$ \frac{1}{-n\textrm{E}\left[\frac{\partial^2}{\partial\theta^2}\log f(x_i\vert\theta)\right]} $$

Evaluating this I get $\frac{4\theta^2}{9n}$.
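(For reference, here is a sketch of the Fisher-information calculation that produces this value. The density is that of $\mathrm{N}(\theta, 4\theta^2)$, and the expectations use $\mathrm{E}(X-\theta) = 0$ and $\mathrm{E}[(X-\theta)^2] = 4\theta^2$.)

```latex
% Fisher information for one observation from N(theta, 4*theta^2)
\begin{align*}
\log f(x \mid \theta)
  &= -\tfrac{1}{2}\log(8\pi\theta^2) - \frac{(x-\theta)^2}{8\theta^2} \\
\frac{\partial^2}{\partial\theta^2}\log f(x \mid \theta)
  &= \frac{3}{4\theta^2} - \frac{x-\theta}{\theta^3} - \frac{3(x-\theta)^2}{4\theta^4} \\
\mathrm{E}\!\left[\frac{\partial^2}{\partial\theta^2}\log f(X \mid \theta)\right]
  &= \frac{3}{4\theta^2} - 0 - \frac{3\cdot 4\theta^2}{4\theta^4}
   = -\frac{9}{4\theta^2}
\end{align*}
```

so the per-observation information is $I(\theta) = 9/(4\theta^2)$ and the bound is $1/(nI(\theta)) = 4\theta^2/(9n)$.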

For that distribution, however, from the exponential-family representation I think that $\left(\sum_i X_i, \sum_i X_i^2 \right)$ is a complete sufficient statistic, and therefore any unbiased estimator based only on it should achieve the CRLB. But the variance of $\bar{X}$, which is an unbiased estimator of $\theta$ based only on that statistic, is $\frac{4\theta^2}{n}$, which is larger.
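A quick Monte Carlo sanity check confirms both numbers: $\mathrm{Var}(\bar{X}) \approx 4\theta^2/n$, which sits well above the bound $4\theta^2/(9n)$. (A minimal sketch; the values $\theta = 2$ and $n = 50$ are arbitrary choices for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 50, 200_000

# Draw `reps` samples of size n from N(theta, 4*theta^2); sd = 2*theta.
samples = rng.normal(theta, 2 * theta, size=(reps, n))
xbar = samples.mean(axis=1)

var_xbar = xbar.var()           # empirical Var(X-bar), close to 4*theta^2/n
crlb = 4 * theta**2 / (9 * n)   # the Cramer-Rao lower bound

print(var_xbar, crlb)
```

With these settings the empirical variance comes out near $4\theta^2/n = 0.32$, roughly nine times the bound $\approx 0.036$.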

Where have I gone wrong?

Best Answer

I think there are a couple of things wrong with your reasoning.

First, the result you're alluding to (the Lehmann-Scheffé theorem) requires that a statistic be both sufficient and complete. This particular distribution is a curved exponential family, and the statistic $T = (\sum_i X_i, \sum_i X_i^2)$ turns out not to be complete. We can see this by defining a new statistic $g(T) = n \bar{X}_n^2 / (4 + n) - S_n^2 / 4$, where $S_n^2$ is the sample variance, and calculating (using $\text{E}(S_n^2) = 4\theta^2$)
\begin{align}
\text{E}[g(T)] &= \frac{n}{4 + n} \text{E} ( \bar{X}_n^2 ) - \frac{1}{4} \text{E} ( S_n^2 ) \\
&= \frac{n}{4 + n} \left ( \text{Var}(\bar{X}_n) + \text{E}(\bar{X}_n)^2 \right ) - \theta^2 \\
&= \frac{n}{4 + n} \left ( \frac{4 \theta^2}{n} + \theta^2 \right ) - \theta^2 \\
&= 0 .
\end{align}
It's not difficult to verify that $g(T)$ is not degenerate at zero, and so $T$ is not a complete statistic.

Second, even if $T$ were complete, the conclusion would be that any unbiased function of it is a UMVUE, which does not guarantee that it achieves the Cramér-Rao lower bound.
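The incompleteness argument is easy to check numerically: $g(T)$ averages to zero but has clearly positive spread. (A simulation sketch; $\theta = 2$ and $n = 10$ are arbitrary illustration values.)

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 10, 200_000

# Samples from N(theta, 4*theta^2); sd = 2*theta.
samples = rng.normal(theta, 2 * theta, size=(reps, n))
xbar = samples.mean(axis=1)
s2 = samples.var(axis=1, ddof=1)   # unbiased sample variance, E[s2] = 4*theta^2

# The statistic from the answer: g(T) = n*xbar^2/(4+n) - s2/4
g = n * xbar**2 / (4 + n) - s2 / 4

print(g.mean())   # close to 0, as derived above ...
print(g.std())    # ... yet g(T) is clearly not degenerate at zero
```

A nonzero function of $T$ with expectation zero for every $\theta$ is exactly what completeness rules out, so this exhibits the failure directly.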