Minimum variance when the estimator is unbiased?


Show that if an estimator $\hat\mu=a_1X_1 +a_2X_2 +\cdots+a_nX_n$, where $a_1, a_2,\ldots,a_n$ are constants, is unbiased, then its variance is minimized when $a_1=a_2=\cdots=a_n=\frac{1}{n}$, i.e., when $\hat\mu=\bar X$.

I've tried using the constraint $\sum a_i=1$, and I know that $\sum a_i^2$ is minimized by choosing $a_1=a_2=\cdots=a_n=1/n$, but I'm not sure what to do next.

We are assuming all observations are iid.



This statement is not true in full generality. For instance, if your $X_i$'s are dependent, not identically distributed, or do not have finite second moments, then it can fail.

If your observations are iid with finite variance, which is probably assumed, then the answer you want can be found by solving

$$\min_{\sum_{i=1}^n a_i=1} V\biggl(\sum_{i=1}^n a_i X_i\biggr),$$

using the familiar rules

$$V(cX)=c^2V(X)$$

and

$$\text{$X,Y$ independent with finite variances} \Rightarrow V(X+Y)=V(X)+V(Y).$$


$\newcommand{\var}{\operatorname{var}}$

$$ \begin{align} \var(a_1 X_1+\cdots+a_nX_n) & = a_1^2\var(X_1)+\cdots+a_n^2\var(X_n) \\[8pt] & = a_1^2\sigma^2+\cdots+a_n^2\sigma^2 \\[8pt] & = (a_1^2+\cdots+a_n^2)\sigma^2. \end{align} $$

So it's just the problem of minimizing $a_1^2+\cdots+a_n^2$ subject to $a_1+\cdots+a_n=1$.
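One way to finish that constrained minimization, which is the step the question gets stuck on, is to write each weight as a perturbation of $1/n$. Set $a_i=\frac{1}{n}+d_i$; the constraint $a_1+\cdots+a_n=1$ forces $\sum_{i=1}^n d_i=0$, so

$$ \begin{align} \sum_{i=1}^n a_i^2 & = \sum_{i=1}^n\Bigl(\frac{1}{n}+d_i\Bigr)^2 \\[8pt] & = \frac{1}{n}+\frac{2}{n}\sum_{i=1}^n d_i+\sum_{i=1}^n d_i^2 \\[8pt] & = \frac{1}{n}+\sum_{i=1}^n d_i^2 \ \ge\ \frac{1}{n}, \end{align} $$

with equality exactly when every $d_i=0$, i.e. $a_i=1/n$ for all $i$. The minimum variance is then $\sigma^2/n$, the variance of $\bar X$.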

All this assumes that all variances are equal and all covariances are $0$.

If the covariances are $0$ but the variances differ, then the smallest variance among all weighted averages is attained when the weights are proportional to the reciprocals of the variances.
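Both claims, that $1/n$ weights are optimal for equal variances and that inverse-variance weights are optimal otherwise, are easy to sanity-check numerically. A minimal sketch (the variances in `sigma2` are made up for illustration):

```python
# For uncorrelated X_i, Var(sum a_i X_i) = sum a_i^2 sigma_i^2.
def estimator_variance(weights, sigma2):
    return sum(a * a * s for a, s in zip(weights, sigma2))

sigma2 = [1.0, 4.0, 9.0]      # hypothetical (unequal) variances
n = len(sigma2)

equal = [1.0 / n] * n         # plain average, weights 1/n

inv = [1.0 / s for s in sigma2]            # weights proportional to 1/sigma_i^2
total = sum(inv)
inv_weights = [w / total for w in inv]     # normalized so they sum to 1

v_equal = estimator_variance(equal, sigma2)
v_opt = estimator_variance(inv_weights, sigma2)

# Closed form for the minimum with inverse-variance weights:
v_closed = 1.0 / total

print(v_equal, v_opt, v_closed)
```

With unequal variances the inverse-variance weighting strictly beats the plain average, and its variance matches the closed form $1/\sum_i \sigma_i^{-2}$; if all $\sigma_i^2$ are equal, the inverse-variance weights reduce to $1/n$ and the two estimators coincide.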