Difference between MVB and UMVU estimators

I am trying to understand the difference between the UMVUE (uniformly minimum-variance unbiased estimator, also known as minimum-variance unbiased estimator (MVUE)) and the MVBE (minimum variance bound estimator).

There seems to be a lot of writing on the UMVUE, but not so much on the MVBE. Here is what I have found that discusses this exact topic:

  • This, which seems to indicate that an MVBE would also be a UMVUE (since the variance of an MVBE can be no larger than that of the UMVUE).
  • And this (see page 15), which also says that an MVBE is again the UMVUE.

However, I'm still unsure of the fundamental difference between the two.

  • The MVBE is unbiased and attains (meaning it equals) the lower bound of the Cramer-Rao inequality (again from page 15 of that second source)
  • "an unbiased estimator which achieves this [Cramer-Rao] lower bound is said to be (fully) efficient. Such a solution achieves the lowest possible mean squared error among all unbiased methods, and is therefore the [UMVUE]" (source).

Are these not both the same thing?

I take MVBE to mean an unbiased estimator whose variance attains the Cramer-Rao bound.

Therefore, if an MVBE exists, it is always the UMVUE. But the converse is not true, because the variance of the UMVUE does not necessarily attain the Cramer-Rao bound.

For a concrete example, consider $X_1,X_2,\ldots,X_n$ i.i.d. Exponential with rate $\theta$. The joint pdf is

$$f_{\theta}(\boldsymbol x)=\theta^n \exp\left(-\theta\sum_{i=1}^n x_i\right)\mathbf1_{x_1,\ldots,x_n>0} \quad,\,\theta>0$$

Therefore, $$\frac{\partial}{\partial\theta}\ln f_{\theta}(\boldsymbol x)=\frac{n}{\theta}-\sum_{i=1}^n x_i=-n\left(\overline x_n - \frac1{\theta}\right) \tag{$\star$}$$
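For reference, the equality condition being invoked here: equality in the Cramer-Rao inequality for estimating $g(\theta)$ holds exactly when the score factors as

$$\frac{\partial}{\partial\theta}\ln f_{\theta}(\boldsymbol x)=k(\theta)\left(T(\boldsymbol x)-g(\theta)\right)$$

for some statistic $T$ not depending on $\theta$, in which case $T$ is the MVBE of $g(\theta)$ with variance $g'(\theta)/k(\theta)$. In $(\star)$, $k(\theta)=-n$, $T(\boldsymbol x)=\overline x_n$ and $g(\theta)=1/\theta$, giving variance $(-1/\theta^2)/(-n)=1/(n\theta^2)$.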

Now $(\star)$ is in the form of the equality condition of the Cramer-Rao inequality, so the variance of the sample mean $\overline X_n$ attains the Cramer-Rao lower bound for $1/\theta$. Moreover, $\overline X_n$ is unbiased for $1/\theta$. Therefore, $\overline X_n$ is an MVBE as well as the UMVUE of $1/\theta$.
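This can be sanity-checked numerically. Below is a small Monte Carlo sketch (the variable names are mine) comparing the variance of $\overline X_n$ with $\text{CRLB}(1/\theta)=1/(n\theta^2)$:

```python
import numpy as np

# Exponential(rate=theta): Var(X) = 1/theta^2, so Var(Xbar) = 1/(n*theta^2),
# which equals the Cramer-Rao lower bound for estimating 1/theta.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000

# NumPy's exponential takes the scale parameter 1/rate.
samples = rng.exponential(scale=1 / theta, size=(reps, n))
xbar = samples.mean(axis=1)

print(xbar.mean())  # close to 1/theta = 0.5   (unbiasedness)
print(xbar.var())   # close to 1/(n*theta^2) = 0.025   (attains the bound)
```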

But there is no MVBE of $\theta$: from $(\star)$, only functions of the form $a/\theta+b$ admit unbiased estimators whose variance attains the Cramer-Rao bound, and $\theta$ itself is not of this form. The UMVUE of $\theta$ exists regardless, and is given by $\hat\theta=\frac{n-1}{\sum_{i=1}^n X_i}$ for $n>1$. As an exercise, using the distribution of $\sum_{i=1}^n X_i$ (which is Gamma with shape $n$ and rate $\theta$), one can show that the variance of $\hat\theta$ exceeds the Cramer-Rao bound for $\theta$:

$$\operatorname{Var}_{\theta}(\hat\theta)=\frac{\theta^2}{n-2}>\frac{\theta^2}{n}=\text{CRLB}(\theta)\quad,\,n>2$$
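The gap above can also be seen numerically. Here is a Monte Carlo sketch in the same spirit as before (variable names are mine):

```python
import numpy as np

# UMVUE theta_hat = (n-1)/sum(X_i) for Exponential(rate=theta) data:
# unbiased, but Var(theta_hat) = theta^2/(n-2) > theta^2/n = CRLB(theta).
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 10, 200_000

# NumPy's exponential takes the scale parameter 1/rate.
sums = rng.exponential(scale=1 / theta, size=(reps, n)).sum(axis=1)
theta_hat = (n - 1) / sums

print(theta_hat.mean())  # close to theta = 2   (unbiasedness)
print(theta_hat.var())   # close to theta^2/(n-2) = 0.5, above CRLB = 0.4
```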