Determining the Cramér–Rao lower bound


Let $X = (X_1,\dots,X_n)$ be a vector of iid random variables from the smooth density $f(x;\theta_0)$, $\theta_0 \in \Theta \subset \mathbb{R}$. Let $L(\theta)$ be the likelihood and $I(\theta)$ the Fisher information of $X$ for $\theta$.

For $f(x;\sigma^2) = \frac{x}{\sigma^2} \exp\left\{ - \frac{x^2}{2\sigma^2} \right\}$, $x \geq 0$, determine the Cramér–Rao lower bound for estimating $\sigma$ and $\sigma^2$, and demonstrate whether or not it is attainable in each of the two cases.

As part of the problem, I managed to show that an unbiased estimator $\hat{\theta}$ of $\theta$ attains the Cramér–Rao lower bound iff $\frac{\partial \log L(\theta)}{\partial \theta} = I(\theta)\left(\hat{\theta}-\theta\right)$.
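For reference, here is a sketch of how I derived this condition (equality in the Cauchy–Schwarz step of the CRLB proof forces the score to be an affine function of $\hat{\theta}$):

$$\frac{\partial \log L(\theta)}{\partial \theta} = a(\theta)\left(\hat{\theta} - b(\theta)\right).$$

Since the score has mean zero and $\hat{\theta}$ is unbiased, $b(\theta) = \theta$. The standard identity $\operatorname{Cov}_\theta\!\left(\hat{\theta}, \frac{\partial \log L}{\partial \theta}\right) = 1$ gives $a(\theta)\operatorname{Var}_\theta(\hat{\theta}) = 1$, and $\operatorname{Var}_\theta\!\left(\frac{\partial \log L}{\partial \theta}\right) = I(\theta)$ then gives $I(\theta) = a(\theta)^2 \operatorname{Var}_\theta(\hat{\theta}) = a(\theta)$, so $a(\theta) = I(\theta)$ and $\operatorname{Var}_\theta(\hat{\theta}) = 1/I(\theta)$.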

I was able to get a lower bound by differentiating $\log f(x;\sigma^2)$ with respect to $\sigma^2$; this procedure gives $I(\sigma^2)$, right? But how do I calculate $I(\sigma)$? I feel that I'm missing something easy, since I'm not sure how to obtain $f(x;\sigma)$ from $f(x;\sigma^2)$...
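Separately, here is the quick simulation I used to sanity-check my $\sigma^2$ computation. It assumes the value I derived, $I(\sigma^2) = n/\sigma^4$ (so CRLB $= \sigma^4/n$), and the estimator $\hat{\theta} = \frac{1}{2n}\sum_i X_i^2$, which is unbiased since $E[X^2] = 2\sigma^2$ for this density; treat both as my claims rather than givens:

```python
import numpy as np

# Sanity check of the CRLB for sigma^2 under the Rayleigh density
# f(x; sigma^2) = (x / sigma^2) exp(-x^2 / (2 sigma^2)), x >= 0.
# My claimed information is I(sigma^2) = n / sigma^4, i.e. CRLB = sigma^4 / n.

rng = np.random.default_rng(0)
sigma, n, trials = 2.0, 50, 20000

# numpy's rayleigh(scale=sigma) has exactly this density
x = rng.rayleigh(scale=sigma, size=(trials, n))

# hat(theta) = sum(x_i^2) / (2n): unbiased for sigma^2 since E[X^2] = 2 sigma^2
theta_hat = (x**2).sum(axis=1) / (2 * n)

crlb = sigma**4 / n  # claimed bound: 1 / I(sigma^2)

print(theta_hat.mean())  # should be close to sigma^2 = 4.0
print(theta_hat.var())   # should be close to crlb = 0.32 if the bound is attained
```

In the simulation the empirical variance of $\hat{\theta}$ matches the claimed bound, which is consistent with the bound being attained in the $\sigma^2$ case.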