Proving the uniqueness of a minimum variance unbiased estimator


Theorem: Suppose that $T_1$ and $T_2$ are both minimum variance unbiased estimators of $h(\theta)$, a function of the parameter of interest $\theta \in \Theta$.

Then $\forall \theta \in \Theta:T_1 = T_2$ with probability 1.

Proof: As both $T_1$ and $T_2$ are unbiased estimators of $h(\theta)$ it holds that $$E_{\theta}T_1 = E_{\theta}T_2 = h(\theta)$$

Note: the subscript $\theta$ indicates the fact that we are considering a family of probability density functions rather than a single pdf.

Furthermore, since both estimators have minimum variance, denoting the variance under $\theta$ by $D^{2}_{\theta}$, we can write

$$D^{2}_{\theta}T_1 = D^{2}_{\theta}T_2.$$

Therefore

$$D^{2}_{\theta}(T_1) \leq D^{2}_{\theta}\left(\frac{T_1+T_2}{2}\right) = \frac{D^{2}_{\theta}\left(T_1\right) + 2Cov \left(T_1, T_2\right) + D^{2}_{\theta}\left(T_2\right)}{4} = \frac{D^{2}_{\theta}\left(T_1\right)+ Cov \left(T_1, T_2\right)}{2},$$

where the last equality uses $D^{2}_{\theta}(T_1) = D^{2}_{\theta}(T_2)$.

Multiplying this inequality by 2 and subtracting $D^{2}_{\theta}(T_1)$, and denoting the correlation coefficient by $R\left(T_1, T_2\right)$ and the standard deviation by $D_{\theta}$, we obtain

$$D^{2}_{\theta}\left(T_1\right) \leq Cov\left(T_1, T_2\right) = D_{\theta}\left(T_1\right)D_{\theta}\left(T_2\right)R(T_1, T_2) = D^{2}_{\theta}\left(T_1\right)R(T_1, T_2)$$
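The two identities used here can be checked numerically. A minimal sketch with simulated correlated draws standing in for $T_1$ and $T_2$ (the estimators themselves are hypothetical stand-ins, not from a specific model):

```python
import numpy as np

# Hypothetical correlated draws standing in for T1 and T2; the
# variance-of-average expansion holds for any pair of random variables.
rng = np.random.default_rng(0)
t1 = rng.normal(size=100_000)
t2 = 0.6 * t1 + 0.8 * rng.normal(size=100_000)

lhs = np.var((t1 + t2) / 2)
cov = np.cov(t1, t2, ddof=0)[0, 1]
rhs = (np.var(t1) + 2 * cov + np.var(t2)) / 4

# Cov(T1, T2) = D(T1) D(T2) R(T1, T2)
r = np.corrcoef(t1, t2)[0, 1]

print(abs(lhs - rhs) < 1e-9)                      # variance-of-average expansion
print(abs(cov - t1.std() * t2.std() * r) < 1e-9)  # covariance-correlation identity
```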

We know that the following correlation-inequality holds:

$$-1 \leq R\left(T_1, T_2\right) \leq 1$$

Hence

$$D^{2}_{\theta}\left(T_1\right)R(T_1, T_2) \leq D^{2}_{\theta}\left(T_1\right)$$

Combining the last two displays,

$$D^{2}_{\theta}\left(T_1\right) \leq Cov\left(T_1, T_2\right) = D^{2}_{\theta}\left(T_1\right)R(T_1, T_2) \leq D^{2}_{\theta}\left(T_1\right),$$

so equality holds throughout. In particular,

$$D^{2}_{\theta}\left(T_1\right) = D^{2}_{\theta}\left(T_2\right) = Cov\left(T_1, T_2\right).$$

Finally, substituting these equalities into the variance of the difference:

$$D^{2}_{\theta}\left(T_1 - T_2 \right) = D^{2}_{\theta}\left(T_1\right) - 2Cov\left(T_1, T_2\right) + D^{2}_{\theta}\left(T_2\right) = 0$$

Since $T_1 - T_2$ has zero mean and zero variance, it is zero with probability one, and we arrive at the conclusion:

$$\forall \theta \in \Theta : P_{\theta}\left(T_1 = T_2\right) = 1.$$
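The step from zero variance to almost-sure equality can be made explicit via Chebyshev's inequality: since $E_{\theta}(T_1 - T_2) = 0$,

$$\forall \varepsilon > 0: \quad P_{\theta}\left(|T_1 - T_2| \geq \varepsilon\right) \leq \frac{D^{2}_{\theta}\left(T_1 - T_2\right)}{\varepsilon^{2}} = 0,$$

and taking the union over $\varepsilon = \frac{1}{n}$, $n = 1, 2, \dots$ gives $P_{\theta}\left(T_1 \neq T_2\right) = 0$.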

Question: How does the first inequality of the proof arise?

The justification of the first inequality is the following:

We construct a new estimator from $T_1$ and $T_2$: $T^{'}:=\frac{T_1 + T_2}{2}$. Note that $T^{'}$ is itself an unbiased estimator of $h(\theta)$, since $E_{\theta}T^{'} = \frac{1}{2}\left(E_{\theta}T_1 + E_{\theta}T_2\right) = h(\theta)$.

Since $T_1$ is a minimum variance unbiased estimator, its variance cannot exceed the variance of any other unbiased estimator, in particular that of $T^{'}$.
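As a concrete illustration of this step, consider a normal sample: the sample mean is the minimum variance unbiased estimator of the mean, while the sample median is another unbiased estimator (by symmetry). A minimal simulation, with the model $N(3, 1)$, sample size, and seed chosen purely for illustration, shows that $T^{'}$ cannot beat $T_1$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 25, 200_000
# 200,000 hypothetical samples of size 25 from N(3, 1)
x = rng.normal(loc=3.0, scale=1.0, size=(reps, n))

t1 = x.mean(axis=1)        # sample mean: the UMVUE of the normal mean
t2 = np.median(x, axis=1)  # sample median: also unbiased, by symmetry
tp = (t1 + t2) / 2         # the combined estimator T'

# T' is unbiased, but the minimum variance estimator T1 still has
# the smaller variance
print(t1.var(), tp.var())
```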