Find Least Squares Regression Line


I have a problem where I need to find the least squares regression line, i.e. to estimate the coefficients $\beta_0$ and $\beta_1$ in the following model

$$y = \beta_0 + \beta_1 \cdot x + \epsilon$$

So I have both the vectors $y$ and $x$.

I know that $\hat{y}$, the vector of predicted values of $y$, is $x \cdot \beta$, and that the residual vector is $\epsilon = y - \hat{y}$.

I also know that the least squares regression line looks something like this $$\hat{y} = a + b \cdot x$$ and that what I need to find is $a$ and $b$, but I don't know exactly how to do it. I need to do this in Matlab. Any idea how I should proceed?

Correct me if I said anything wrong, in any case.


Accepted answer:

First define

X = [ones(size(x)) x];

then type

regress(y,X)

Observations:

  • the first step is to include a constant in the regression (otherwise you would be imposing $a=0$).

  • the output will be a vector with the OLS estimates $(a,b)$.
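For readers who want to sanity-check this outside MATLAB, the same fit can be sketched in NumPy (hypothetical data; `np.linalg.lstsq` plays the role of `regress` here):

```python
import numpy as np

# Hypothetical sample data lying exactly on y = 2 + 3x
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 + 3.0 * x

# Design matrix with a constant column, mirroring X = [ones(size(x)) x]
X = np.column_stack([np.ones_like(x), x])

# Least squares fit; coeffs holds (a, b)
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coeffs)  # ≈ [2. 3.]
```

As in the MATLAB version, omitting the column of ones would force the fitted line through the origin ($a = 0$).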

Second answer:

Sequence of $m$ measurements: $$\left\{ x_{k}, y_{k} \right\}_{k=1}^{m}$$ Model: $$ y(x) = \beta_{0} + \beta_{1} x $$ Linear system $\mathbf{A} \, \beta = y$:
$$
\left[ \begin{array}{cc} 1 & x_{1} \\ 1 & x_{2} \\ \vdots & \vdots \\ 1 & x_{m} \end{array} \right]
\left[ \begin{array}{c} \beta_{0} \\ \beta_{1} \end{array} \right]
=
\left[ \begin{array}{c} y_{1} \\ y_{2} \\ \vdots \\ y_{m} \end{array} \right]
$$
Least squares solution:
$$ \beta_{LS} = \left\{ \beta \in \mathbb{C}^{2} \colon \lVert \mathbf{A} \, \beta - y \rVert_{2}^{2} \text{ is minimized} \right\} $$
Solution type: $\mathbf{A}$ has full column rank, so the solution is unique (a point).


Solution method 1: normal equations $\mathbf{A}^{*} \mathbf{A} \, \beta = \mathbf{A}^{*} y$:
$$
\left[ \begin{array}{cc} \mathbf{1} \cdot \mathbf{1} & \mathbf{1} \cdot x \\ x \cdot \mathbf{1} & x \cdot x \end{array} \right]
\left[ \begin{array}{c} \beta_{0} \\ \beta_{1} \end{array} \right]
=
\left[ \begin{array}{c} \mathbf{1} \cdot y \\ x \cdot y \end{array} \right]
$$
$$ \Downarrow $$
$$
\beta = \left( \mathbf{A}^{*} \mathbf{A} \right)^{-1} \mathbf{A}^{*} y:
\qquad
\left[ \begin{array}{c} \beta_{0} \\ \beta_{1} \end{array} \right]
=
\left( \left( \mathbf{1} \cdot \mathbf{1} \right) \left( x \cdot x \right) - \left( \mathbf{1} \cdot x \right)^{2} \right)^{-1}
\left[ \begin{array}{rr} x \cdot x & -\mathbf{1} \cdot x \\ -\mathbf{1} \cdot x & \mathbf{1} \cdot \mathbf{1} \end{array} \right]
\left[ \begin{array}{c} \mathbf{1} \cdot y \\ x \cdot y \end{array} \right]
$$
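Expanding the dot products into sums (and using $\mathbf{1} \cdot \mathbf{1} = m$) recovers the familiar scalar formulas for the slope and intercept:
$$
\beta_{1} = \frac{m \sum_{k} x_{k} y_{k} - \sum_{k} x_{k} \sum_{k} y_{k}}{m \sum_{k} x_{k}^{2} - \left( \sum_{k} x_{k} \right)^{2}},
\qquad
\beta_{0} = \bar{y} - \beta_{1} \bar{x},
$$
where $\bar{x}$ and $\bar{y}$ are the sample means.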

Solution method 2: Moore-Penrose pseudoinverse: $$ \beta = \mathbf{A}^{+} y $$


The MATLAB intrinsic mldivide is one option:

beta = A \ y;
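As a sanity check, here is a NumPy sketch (with illustrative made-up data) confirming that the normal equations, the Moore-Penrose pseudoinverse, and a dedicated least squares solver all agree, mirroring the MATLAB options above:

```python
import numpy as np

# Illustrative noisy data around y = 1 + 2x (fixed seed for reproducibility)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 1.0 + 2.0 * x + 0.01 * rng.standard_normal(x.size)

# Design matrix A = [1 x]
A = np.column_stack([np.ones_like(x), x])

# Method 1: normal equations  (A^T A) beta = A^T y
beta_normal = np.linalg.solve(A.T @ A, A.T @ y)

# Method 2: Moore-Penrose pseudoinverse  beta = A^+ y
beta_pinv = np.linalg.pinv(A) @ y

# Method 3: least squares solver (the analogue of MATLAB's A \ y)
beta_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)

print(np.allclose(beta_normal, beta_pinv))   # True
print(np.allclose(beta_normal, beta_lstsq))  # True
```

In practice the solver (method 3) is preferred over forming $\mathbf{A}^{*}\mathbf{A}$ explicitly, since the normal equations square the condition number of the problem.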