least square estimators and simple linear regression related proofs


Suppose that we have independent samples $\{(x_i, y_i) : i = 1, \dots, n\}$ which are assumed to follow $y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, where the $\varepsilon_i$ are i.i.d. $N(0, \sigma^2)$. Suppose that $b_0$ and $b_1$ are the least squares estimators of $\beta_0$ and $\beta_1$ respectively. Define $\hat y_i = b_0 + b_1 x_i$ and $e_i = y_i - \hat y_i$.

Prove each of the following: [statements given in an image, now missing; judging from the answer below, (1) expresses $SSreg$ in terms of $s_{xy}$ and $s_x^2$, and (2) states $R^2 = \frac{s_{xy}^2}{s_x^2 s_y^2}$]

where $s_x^2$ and $s_y^2$ are the sample variances of $\{x_i : i = 1, \dots, n\}$ and $\{y_i : i = 1, \dots, n\}$ respectively, and $s_{xy}$ is their sample covariance.

I know that the following will be a useful fact, but I have not been able to prove it yet: $$\hat y_i = b_0 + b_1 x_i = \bar y + b_1 (x_i - \bar x).$$
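The hint can be sanity-checked numerically before proving it algebraically. The identity holds because $b_0 = \bar y - b_1 \bar x$ for least squares, so $b_0 + b_1 x_i = \bar y + b_1(x_i - \bar x)$. A minimal sketch on made-up data (the values are illustrative, not from the problem):

```python
# Check that yhat_i = b0 + b1*x_i equals ybar + b1*(x_i - xbar),
# using b0 = ybar - b1*xbar from the least squares normal equations.
x = [1.0, 2.0, 3.0, 4.0, 5.0]  # made-up data
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
     / sum((xi - xbar) ** 2 for xi in x)
b0 = ybar - b1 * xbar
yhat = [b0 + b1 * xi for xi in x]
alt = [ybar + b1 * (xi - xbar) for xi in x]
assert all(abs(a - b) < 1e-9 for a, b in zip(yhat, alt))
```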

Any advice will be greatly appreciated.

On BEST ANSWER
  1. Since the fitted line passes through $(\bar x, \bar y)$, we have $\bar y = b_0 + b_1 \bar x$, so \begin{align} SSreg &= \sum ( \hat y_i - \bar{y} ) ^ 2\\ &= \sum ( b_0 + b_1 x_i - b_0 - b_1 \bar x ) ^ 2\\ &= b_1^2 \sum ( x_i - \bar x ) ^ 2, \end{align} and now just replace $b_1$ with its explicit form, which is $$ b_1 = \frac{\sum(x_i - \bar x)(y_i - \bar y)}{\sum( x_i - \bar x )^2}. $$
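Step (1) can be verified numerically: after substituting the explicit form of $b_1$, it says $SSreg = b_1^2 \sum (x_i - \bar x)^2 = \frac{\left(\sum (x_i - \bar x)(y_i - \bar y)\right)^2}{\sum (x_i - \bar x)^2}$. A sketch on made-up data:

```python
# Check SSreg = sum((yhat_i - ybar)^2) against both forms from step (1).
x = [1.0, 2.0, 3.0, 4.0, 5.0]  # made-up data
y = [2.3, 3.1, 5.2, 6.9, 8.4]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)            # sum of squares in x
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
b1 = sxy / sxx
b0 = ybar - b1 * xbar
yhat = [b0 + b1 * xi for xi in x]
ssreg = sum((yh - ybar) ** 2 for yh in yhat)
assert abs(ssreg - b1 ** 2 * sxx) < 1e-9           # SSreg = b1^2 * Sxx
assert abs(ssreg - sxy ** 2 / sxx) < 1e-9          # SSreg = Sxy^2 / Sxx
```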

  2. Use the result from (1) and note that $b_1 = \frac{s_{xy}}{s_x^2}$. Since $SSreg = (n-1)\, b_1^2 s_x^2$ by (1) and $SST = \sum (y_i - \bar y)^2 = (n-1)\, s_y^2$, the factors of $n-1$ cancel:

\begin{align} R^2 &= \frac{SSreg}{SST}\\ &= \frac{b_1 ^ 2 s_{x}^2}{ s_y^2}\\ &= \frac{s_{xy}^2 s_{x}^2}{ s_x^4 s_y^2}\\ &= \frac{s_{xy}^2 }{ s_x^2 s_y^2}. \end{align}
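The final identity $R^2 = \frac{s_{xy}^2}{s_x^2 s_y^2}$ can also be checked numerically; note that the $n-1$ factors cancel, so raw centered sums of squares work just as well as sample variances. A sketch on made-up data:

```python
# Check R^2 = SSreg/SST against Sxy^2 / (Sxx * Syy); the (n-1) factors
# in the sample variances and covariance cancel in the ratio.
x = [1.0, 2.0, 3.0, 4.0, 5.0]  # made-up data
y = [2.3, 3.1, 5.2, 6.9, 8.4]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
syy = sum((yi - ybar) ** 2 for yi in y)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
b1 = sxy / sxx
b0 = ybar - b1 * xbar
yhat = [b0 + b1 * xi for xi in x]
ssreg = sum((yh - ybar) ** 2 for yh in yhat)
r2 = ssreg / syy                       # SST = Syy
assert abs(r2 - sxy ** 2 / (sxx * syy)) < 1e-9
```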