Least squares in matrix form


So my question is eerily similar to the one asked (and answered) here: Least Squares in a Matrix Form

It is different since it deals with a specific case of LS. I get stuck when taking the matrix inverse at some point in the process.

Given is the following regression model

$$y_i =\beta_1 +\beta_2x_i + \epsilon_i, \hspace{1cm}i=1,...,n$$

In matrix notation this is: $$ \begin{bmatrix} y_1 \\ y_2 \\ y_3 \\ \vdots\\ y_n \end{bmatrix} = \begin{bmatrix} 1 & x_1\\ 1 & x_2\\ 1 & x_3\\ \vdots & \vdots\\ 1 & x_n \end{bmatrix} \begin{bmatrix} \beta_1 \\ \beta_2 \end{bmatrix} + \begin{bmatrix} \epsilon_1\\ \epsilon_2\\ \epsilon_3\\ \vdots\\ \epsilon_n \end{bmatrix} $$

Now taking the general least squares estimator $\hat{\beta} = (X^TX)^{-1}X^Ty$, I want to plug in the values from the regression to obtain the vector of estimates $\hat{\beta}_1$ and $\hat{\beta}_2$.
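As a numerical sanity check on this estimator, here is a minimal NumPy sketch; the data values and the names `X`, `y`, `beta_hat` are my own, made up for illustration:

```python
import numpy as np

# Toy data for the model y_i = beta_1 + beta_2 * x_i + eps_i
# (values are made up for illustration)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Design matrix: a column of ones and the column of x_i
X = np.column_stack([np.ones_like(x), x])

# Normal equations: beta_hat = (X^T X)^{-1} X^T y
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y
print(beta_hat)  # [intercept estimate, slope estimate]
```

In practice one would use `np.linalg.lstsq` rather than forming the inverse explicitly, but the explicit version mirrors the formula above.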

Premultiplying

$$ X = \begin{bmatrix} 1 & x_1\\ 1 & x_2\\ 1 & x_3\\ \vdots & \vdots\\ 1 & x_n \end{bmatrix} $$

by its transpose

$$ X^T = \begin{bmatrix} 1 & 1 & 1 & \dots & 1\\ x_1 & x_2 & x_3 & \dots & x_n \end{bmatrix} $$

yields $X^TX$:

$$ \begin{bmatrix} n & \sum_{i=1}^{n}{x_i}\\ \sum_{i=1}^{n}{x_i} & \sum_{i=1}^{n}{x_i^2} \end{bmatrix} $$

The next step would be to take the inverse of this matrix (and then multiply the result by $X^Ty$). Unfortunately, I don't understand how to take the inverse of these sums.
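I do know the textbook formula for inverting a general $2\times 2$ matrix (valid when the determinant $ad-bc$ is nonzero):

$$ \begin{bmatrix} a & b\\ c & d \end{bmatrix}^{-1} = \frac{1}{ad-bc}\begin{bmatrix} d & -b\\ -c & a \end{bmatrix} $$

Substituting $a=n$, $b=c=\sum_{i=1}^{n} x_i$ and $d=\sum_{i=1}^{n} x_i^2$, the determinant would be $n\sum_{i=1}^{n} x_i^2 - \left(\sum_{i=1}^{n} x_i\right)^2$, but I don't see how to simplify the resulting expressions.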