How to find the smallest variance that can be achieved by an unbiased estimator?


Let $X_1, \dots, X_n$ denote a random sample from the PDF

$$f_{\varphi}(x)= \begin{cases} \varphi x^{\varphi - 1} &\text{if}\, 0 < x < 1, \varphi > 0\\ 0 &\text{otherwise} \end{cases}$$

I'm trying to find the smallest variance that can be achieved by an unbiased estimator of $\varphi$. However, I've never had to find such a thing. How does one calculate this?


First observe that your density belongs to the exponential family. This ensures that the regularity conditions needed to apply the Cramér–Rao inequality are satisfied.

Thus, applying the Cramér–Rao inequality, we have that for any unbiased estimator $T$ of $\varphi$,

$$\mathbb{V}[T]\geq \frac{1}{-n\mathbb{E}\left\{\frac{\partial^2}{\partial\varphi^2}\log f(x;\varphi) \right\}}$$

  • $\log f(x;\varphi)=\log \varphi+(\varphi-1)\log x$

  • $\frac{\partial^2}{\partial\varphi^2}\log f(x;\varphi) =-\frac{1}{\varphi^2}$
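The second derivative above can be double-checked symbolically. A minimal sketch with SymPy (the symbol names `phi` and `x` are my own choice):

```python
import sympy as sp

# Symbols: phi > 0 is the parameter, 0 < x < 1 the observation.
phi, x = sp.symbols('phi x', positive=True)

# Log-density of one observation: log f(x; phi) = log(phi) + (phi - 1) log(x).
log_f = sp.log(phi) + (phi - 1) * sp.log(x)

# Second derivative with respect to phi.
d2 = sp.diff(log_f, phi, 2)
print(sp.simplify(d2))  # -1/phi**2, matching the bullet above
```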

Since this second derivative is constant in $x$, we get $-\mathbb{E}\left\{\frac{\partial^2}{\partial\varphi^2}\log f(x;\varphi)\right\}=\frac{1}{\varphi^2}$, and therefore

$$\mathbb{V}[T]\geq \frac{\varphi^2}{n}$$
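The bound can also be sanity-checked by simulation. A sketch using NumPy (the chosen values of `phi`, `n`, and `reps` are arbitrary): since $F(x)=x^{\varphi}$ on $(0,1)$, we can sample via the inverse CDF $X=U^{1/\varphi}$, and the MLE works out to $\hat\varphi=-n/\sum_i\log X_i$. The MLE is biased in finite samples but asymptotically efficient, so its variance should sit close to $\varphi^2/n$ for large $n$:

```python
import numpy as np

rng = np.random.default_rng(0)
phi = 2.0    # true parameter (arbitrary choice for the experiment)
n = 2000     # sample size
reps = 2000  # number of simulated samples

# Inverse-CDF sampling: F(x) = x**phi on (0, 1), so X = U**(1/phi).
u = rng.uniform(size=(reps, n))
x = u ** (1.0 / phi)

# MLE: maximizing n*log(phi) + (phi - 1)*sum(log x_i)
# gives phi_hat = -n / sum(log x_i).
phi_hat = -n / np.log(x).sum(axis=1)

crlb = phi**2 / n
print(phi_hat.var(), crlb)  # empirical variance should be close to the bound
```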