Derivation of the ordinary least squares estimator β1 and the sampling distribution?


I am trying to derive the ordinary least squares estimator and its sampling distribution for the model:

$$y = \beta_0 + \beta_1 x + \epsilon$$

How can I obtain the estimator for $\beta_1$?

BEST ANSWER

You could solve the following optimisation problem:

$$\min_{\hat{\beta}_0,\hat{\beta}_1} R=\min_{\hat{\beta}_0,\hat{\beta}_1}\sum_{i=1}^N\left(y_i-\hat{\beta}_0-\hat{\beta}_1x_i\right)^2$$

Differentiating $R$ with respect to $\hat{\beta}_0$ and $\hat{\beta}_1$ and setting each derivative to zero gives

$$\frac{\partial R}{\partial \hat{\beta}_0}=\sum_{i=1}^N-2(y_i-\hat{\beta}_0-\hat{\beta}_1x_i)=0$$

and

$$\frac{\partial R}{\partial \hat{\beta}_1}=\sum_{i=1}^N-2x_i(y_i-\hat{\beta}_0-\hat{\beta}_1x_i)=0$$

By simple algebraic manipulation, using the fact that $\sum_{i=1}^N y_i=N\bar{y}$ (and similarly $\sum_{i=1}^N x_i=N\bar{x}$), the first equation gives:

$$\hat{\beta}_0=\bar{y}-\hat{\beta}_1\bar{x}$$

Similarly, by some manipulation you should obtain:

$$\hat{\beta}_1=\frac{\sum_{i=1}^N(x_i-\bar{x})(y_i-\bar{y})}{\sum_{i=1}^N(x_i-\bar{x})^2}$$
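As a quick sanity check, the two closed-form expressions above can be evaluated directly in Python (the dataset below is made up for illustration, not from the question):

```python
import numpy as np

# Small illustrative dataset (not from the question)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = x.mean(), y.mean()

# beta1_hat = sum((x_i - x_bar)(y_i - y_bar)) / sum((x_i - x_bar)^2)
beta1_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)

# beta0_hat = y_bar - beta1_hat * x_bar
beta0_hat = y_bar - beta1_hat * x_bar

print(beta0_hat, beta1_hat)
```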

You could also use matrix notation instead, i.e. $y,\epsilon\in\mathbb{R}^n$, $\beta\in\mathbb{R}^{k}$ and $X\in\mathbb{R}^{n\times k}$. Essentially, you are solving:

$$y=X\beta + \epsilon$$

To obtain the estimator you minimise the sum of squared errors, i.e. $\epsilon'\epsilon=y'y-2\hat{\beta}'X'y+\hat{\beta}'X'X\hat{\beta}$. Differentiating with respect to $\hat{\beta}$ and setting the result to zero, $-2X'y+2X'X\hat{\beta}=0$, we obtain:

$$\hat{\beta}=(X'X)^{-1}X'y$$
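The matrix formula can be checked numerically with NumPy by solving the normal equations $X'X\hat{\beta}=X'y$ (again with a small made-up dataset); in practice `np.linalg.lstsq` is numerically preferable, but `solve` mirrors the formula directly:

```python
import numpy as np

# Small illustrative dataset (not from the question)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix: a column of ones for the intercept, then x
X = np.column_stack([np.ones_like(x), x])

# Normal equations: (X'X) beta_hat = X'y, i.e. beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

print(beta_hat)  # [beta0_hat, beta1_hat]
```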

Under the Gauss–Markov assumptions, $\hat{\beta}$ is unbiased with variance $\sigma^2(X'X)^{-1}$; if in addition the errors are normally distributed, then $\hat{\beta}\sim\mathcal{N}\left(\beta,\sigma^2(X'X)^{-1}\right)$. I will stop here.
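The sampling distribution can be illustrated with a Monte Carlo sketch: fix the design $X$, redraw normal errors many times, and compare the empirical mean and covariance of $\hat{\beta}$ with $\beta$ and $\sigma^2(X'X)^{-1}$ (all numbers below are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed design with intercept; true coefficients and noise level are illustrative
x = np.linspace(0, 10, 50)
X = np.column_stack([np.ones_like(x), x])
beta_true = np.array([1.0, 2.0])
sigma = 0.5

XtX_inv = np.linalg.inv(X.T @ X)

# Repeatedly redraw epsilon ~ N(0, sigma^2 I) and refit beta_hat
estimates = []
for _ in range(20_000):
    y = X @ beta_true + rng.normal(0.0, sigma, size=x.size)
    estimates.append(np.linalg.solve(X.T @ X, X.T @ y))
estimates = np.array(estimates)

# Empirical mean should approach beta; empirical covariance, sigma^2 (X'X)^{-1}
print(estimates.mean(axis=0))
print(np.cov(estimates.T))
print(sigma**2 * XtX_inv)
```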