Compare the variances of two unbiased estimators


Suppose $Y_i\sim N(\beta x_i, \sigma^2)$ with $\sigma$ known. Two unbiased estimators of $\beta$ are $\tilde\beta=\dfrac{S_{xy}}{S_{xx}}=\dfrac{\sum_{i=1}^n(x_i-\overline x)Y_i}{\sum_{i=1}^n(x_i-\overline x)x_i}$ and $\tilde{\tilde\beta}=\dfrac{\overline Y}{\overline x}$. How can I show $\operatorname{Var}(\tilde\beta)\le \operatorname{Var}(\tilde{\tilde\beta})$?

I have gotten as far as determining $\operatorname{Var}(\tilde\beta)=\dfrac{\sigma^2}{\sum_{i=1}^n(x_i-\overline x)^2}=\dfrac{\sigma^2}{\sum_{i=1}^n(x_i-\overline x)x_i}$ (using the identity $\sum_{i=1}^n(x_i-\overline x)x_i=\sum_{i=1}^n(x_i-\overline x)^2$) and $\operatorname{Var}(\tilde{\tilde\beta})=\dfrac{\sigma^2}{n\cdot\overline x^2}=\dfrac{n\sigma^2}{\left(\sum_{i=1}^nx_i\right)^2}$, but I am unsure how to make a proper comparison.
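For anyone who wants to verify these two variance formulas numerically, here is a quick Monte Carlo check (a sketch using numpy; the particular $x$-values, $\beta$, and $\sigma$ below are arbitrary choices, not from the question):

```python
import numpy as np

# Monte Carlo check of the two variance formulas above
# (arbitrary x-values, beta, and sigma).
rng = np.random.default_rng(0)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
beta, sigma, reps = 2.0, 1.5, 200_000

xbar = x.mean()
Y = beta * x + sigma * rng.standard_normal((reps, len(x)))

beta_tilde = Y @ (x - xbar) / np.sum((x - xbar) * x)   # S_xy / S_xx
beta_ttilde = Y.mean(axis=1) / xbar                    # Ybar / xbar

print(beta_tilde.var(), sigma**2 / np.sum((x - xbar)**2))   # ~ equal
print(beta_ttilde.var(), sigma**2 / (len(x) * xbar**2))     # ~ equal
```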

Best answer

$\newcommand{\v}{\operatorname{var}}\newcommand{\c}{\operatorname{cov}}$If you had $$Y_i \sim N(\alpha+ \beta x_i, \sigma^2) \tag 1$$ and the $Y_i$ were independent, then the least-squares estimator of $\beta$ would be $$\widehat\beta = \dfrac{\sum_{i=1}^n (x_i - \overline x)Y_i}{\sum_{i=1}^n (x_i-\overline x)x_i}.$$

But you have $$Y_i\sim N(\beta x_i,\sigma^2).\tag 2$$ You didn't mention independence, but I will assume that was intended. In this situation, the least-squares estimator of $\beta$ is $$ \widehat\beta = \frac{\sum_{i=1}^n x_i Y_i}{\sum_{i=1}^n x_i^2}. \tag 3 $$

Under $(2),$ the random variable $$ \tilde{\tilde\beta} = \frac{\overline Y}{\overline x} \tag 4 $$ is also an unbiased estimator of $\beta.$
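A quick simulation can confirm that both $(3)$ and $(4)$ are unbiased under $(2)$ (a sketch using numpy; the $x$-values, $\beta$, and $\sigma$ are arbitrary choices):

```python
import numpy as np

# Unbiasedness of estimators (3) and (4) under model (2),
# checked by simulation (arbitrary x, beta, sigma).
rng = np.random.default_rng(1)
x = np.array([0.5, 1.0, 2.0, 3.5, 4.0])
beta, sigma, reps = 2.0, 1.5, 200_000

Y = beta * x + sigma * rng.standard_normal((reps, len(x)))

beta_hat = Y @ x / np.sum(x**2)        # least-squares estimator (3)
beta_tt = Y.mean(axis=1) / x.mean()    # the ratio estimator (4)

print(beta_hat.mean(), beta_tt.mean())  # both ~ beta = 2.0
```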

I will take the question to be: how can we show that $\v\left( \widehat\beta \right) \le \v\left( \tilde{\tilde\beta} \right),$ with $\widehat\beta$ defined as in line $(3)$ above? \begin{align} \v\left(\tilde{\tilde\beta}\right) & = \v\left( \widehat\beta + \left( \tilde{\tilde\beta} - \widehat\beta \right) \right) \\[10pt] & = \v\left( \widehat\beta \right) + \v\left( \tilde{\tilde\beta} - \widehat\beta \right) + 2\c\left( \widehat\beta, \, \tilde{\tilde\beta} - \widehat\beta \right) \\[10pt] & = \v\left( \widehat\beta \right) + \v\left( \tilde{\tilde\beta} - \widehat\beta \right) + 0 \qquad \text{(see below for the justification of this line)} \\[10pt] & \ge \v\left( \widehat\beta\right). \end{align} It remains to show that the covariance appearing above is zero; equivalently (the sign does not matter, since the value is $0$), that $\c\left( \widehat\beta, \, \widehat\beta - \tilde{\tilde\beta} \right) = 0.$ \begin{align} \c\left( \widehat\beta, \, \widehat\beta - \tilde{\tilde\beta} \right) & = \c\left( \frac{\sum_{j=1}^n x_j Y_j}{\sum_{i=1}^n x_i^2}, \frac{\sum_{k=1}^n x_k Y_k}{\sum_{i=1}^n x_i^2} - \frac{\overline Y}{\overline x} \right) \\[10pt] & = \frac 1 {\left( \sum_{i=1}^n x_i^2 \right)^2} \sum_{\ell=1}^n x_\ell^2 \v(Y_\ell) - \frac 1 {\left( \sum_{i=1}^n x_i^2 \right) \overline x} \sum_{j=1}^n x_j\c(Y_j, \overline Y) \\ & \qquad \text{(in the first sum, all terms with $j\ne k$ have vanished,} \\ & \qquad \text{since $\c(Y_j,Y_k)=0$ for $j\ne k$)} \\[10pt] & = \frac 1 {\left( \sum_{i=1}^n x_i^2 \right)^2} \sum_{\ell=1}^n x_\ell^2 \sigma^2 - \frac 1 {\left( \sum_{i=1}^n x_i^2 \right) \overline x} \sum_{j=1}^n x_j \frac{\sigma^2} n, \end{align} using $\c(Y_j,\overline Y) = \v(Y_j)/n = \sigma^2/n,$ and then trivial cancellations do the rest. Specifically, $\sigma^2$ is a common factor, $\sum_{\ell=1}^n x_\ell^2$ cancels one factor of the squared sum in the denominator, and $\sum_{j=1}^n x_j/n$ cancels $\overline x$ in the denominator, so both terms equal $\sigma^2\big/\sum_{i=1}^n x_i^2$ and the difference is $0.$
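If it helps, the decomposition can also be confirmed numerically (a sketch using numpy; $x$, $\beta$, and $\sigma$ are arbitrary choices):

```python
import numpy as np

# Checking the decomposition: cov(beta_hat, beta_tt - beta_hat) ~ 0,
# hence var(beta_tt) ~ var(beta_hat) + var(beta_tt - beta_hat) >= var(beta_hat).
rng = np.random.default_rng(2)
x = np.array([0.5, 1.0, 2.0, 3.5, 4.0])
beta, sigma, reps = 2.0, 1.5, 500_000

Y = beta * x + sigma * rng.standard_normal((reps, len(x)))
beta_hat = Y @ x / np.sum(x**2)
beta_tt = Y.mean(axis=1) / x.mean()

diff = beta_tt - beta_hat
print(np.cov(beta_hat, diff)[0, 1])                # ~ 0
print(beta_hat.var(), beta_tt.var())               # var(beta_hat) <= var(beta_tt)
print(beta_hat.var() + diff.var(), beta_tt.var())  # ~ equal
```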

NOTE: All of the above can be proved with weaker assumptions, as follows:

  • Suppose $\operatorname E(Y_i) = \beta x_i$ and $\v(Y_i)= \sigma^2,$ for $i=1,\ldots, n,$ but do not assume these are normally distributed;
  • We may drop any assumption that the $Y_i - \operatorname E(Y_i)$ all have the same distribution;
  • Suppose $\c(Y_j,Y_k) = 0$ for $j\ne k$ but do not assume independence.

Everything in the argument above still works.
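For instance, the same numerical check goes through with non-normal, non-identically-distributed errors that are merely uncorrelated with mean $0$ and common variance $\sigma^2$ (a sketch; the error distributions below are arbitrary choices satisfying those assumptions):

```python
import numpy as np

# Same check with a mix of centered uniform and shifted-exponential errors,
# each scaled to variance sigma^2; independent across i, hence uncorrelated.
rng = np.random.default_rng(3)
x = np.array([0.5, 1.0, 2.0, 3.5, 4.0])
beta, sigma, reps = 2.0, 1.5, 500_000
n = len(x)

u = sigma * np.sqrt(12) * (rng.random((reps, 3)) - 0.5)   # Var = sigma^2
e = sigma * (rng.exponential(1.0, (reps, n - 3)) - 1.0)   # mean 0, Var = sigma^2
eps = np.concatenate([u, e], axis=1)

Y = beta * x + eps
beta_hat = Y @ x / np.sum(x**2)
beta_tt = Y.mean(axis=1) / x.mean()
diff = beta_tt - beta_hat

print(np.cov(beta_hat, diff)[0, 1])   # still ~ 0
print(beta_hat.var(), beta_tt.var())  # still var(beta_hat) <= var(beta_tt)
```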