Coefficients of the linear model $y=β_0+β_1x_1+β_2x_2$ that minimise the sum of squared errors


Consider the following sample: $$(x_{11}, x_{12}, y_1) = (1, 3, 4), \quad (x_{21}, x_{22}, y_2) = (2, 1, 5),$$ $$(x_{31}, x_{32}, y_3) = (3, 0, 7), \quad (x_{41}, x_{42}, y_4) = (4, -2, 6).$$

How can I find the coefficients of the linear model $y = β_0 + β_1x_1 + β_2x_2$ that minimise the sum of squared errors using the normal equations, and justify that this solution is unique?


The minimiser is given by the closed-form solution of the normal equations:

$$ \begin{pmatrix} \beta_0 \\ \beta_1 \\ \beta_2 \end{pmatrix} = \left(\mathbf{X}^T\mathbf{X}\right)^{-1}\mathbf{X}^T\mathbf{Y}, $$

where

$$ \mathbf{Y} = \begin{pmatrix} y_1 \\ y_2 \\ y_3 \\ y_4 \end{pmatrix} = \begin{pmatrix} 4 \\ 5 \\ 7 \\ 6 \end{pmatrix} \text{ and } \mathbf{X} = \begin{pmatrix} 1 & x_{11} & x_{12} \\ 1 & x_{21} & x_{22} \\ 1 & x_{31} & x_{32} \\ 1 & x_{41} & x_{42} \end{pmatrix} = \begin{pmatrix} 1 & 1 & 3 \\ 1 & 2 & 1 \\ 1 & 3 & 0 \\ 1 & 4 & -2 \end{pmatrix}. $$
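Carrying out these products for the given data (a worked step, added as a check; the original answer stops at the formula):

$$ \mathbf{X}^T\mathbf{X} = \begin{pmatrix} 4 & 10 & 2 \\ 10 & 30 & -3 \\ 2 & -3 & 14 \end{pmatrix}, \qquad \mathbf{X}^T\mathbf{Y} = \begin{pmatrix} 22 \\ 59 \\ 5 \end{pmatrix}, $$

and solving the $3\times 3$ system $\mathbf{X}^T\mathbf{X}\,\boldsymbol{\beta} = \mathbf{X}^T\mathbf{Y}$ gives

$$ \boldsymbol{\beta} = \begin{pmatrix} \beta_0 \\ \beta_1 \\ \beta_2 \end{pmatrix} = \begin{pmatrix} -11/2 \\ 4 \\ 2 \end{pmatrix}. $$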

Here $\mathbf{X}^T$ denotes the transpose of $\mathbf{X}$, and $\left(\mathbf{X}^T\mathbf{X}\right)^{-1}$ the inverse of the matrix product $\mathbf{X}^T\mathbf{X}$. The solution is unique because the three columns of $\mathbf{X}$ are linearly independent ($\mathbf{X}$ has full column rank), so $\mathbf{X}^T\mathbf{X}$ is symmetric positive definite and hence invertible.
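The computation can be sketched numerically as follows (a minimal NumPy check, not part of the original answer; `np.linalg.solve` is used on the normal equations rather than forming the inverse explicitly, which is better conditioned):

```python
import numpy as np

# Design matrix with an intercept column, and the response vector,
# taken from the sample in the question.
X = np.array([[1.0, 1.0, 3.0],
              [1.0, 2.0, 1.0],
              [1.0, 3.0, 0.0],
              [1.0, 4.0, -2.0]])
Y = np.array([4.0, 5.0, 7.0, 6.0])

# Uniqueness check: X has full column rank, so X^T X is invertible
# (symmetric positive definite) and the normal equations have
# exactly one solution.
assert np.linalg.matrix_rank(X) == X.shape[1]

# Normal equations: (X^T X) beta = X^T Y.
beta = np.linalg.solve(X.T @ X, X.T @ Y)
print(beta)  # beta = (beta_0, beta_1, beta_2) = (-11/2, 4, 2)
```

The same result can be cross-checked against `np.linalg.lstsq(X, Y, rcond=None)`, which solves the least-squares problem directly.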