Can a linear regression be quadratic?


The following is from a comp. sci. book that discusses regression. The passage seems to say that while a function fitted to a data set may be quadratic, it may yet be considered linear. This seems contradictory, and I'm not entirely sure what I'm missing. Could someone please point me in the right direction?

For example, when we fit a quadratic, we get a model of the form $y=ax^2 + bx +c$. In such a model, the value of the dependent variable $y$ is linear in the independent variables $x^2 , x^1$ and $x^0$ and the coefficients $a, b$ and $c$.

There are 2 answers below.

BEST ANSWER

Linear in the coefficients $a, b, c$, not in $x$. So any of the following models is a linear regression model:

  1. $y=β_0+β_1\ln{x}+ε$,
  2. $y=β_0+β_1x+\dots+β_nx^n+ε$,
  3. $y=β_0+β_1\exp(x)+β_2\cos(x)+β_3\sin(x)+ε$
  4. $\dots$

E.g., just rename $x_1:=\ln{x}$ in the first model to obtain $$y=β_0+β_1x_1+ε,$$ or set $x_1:=\exp(x), x_2:=\cos(x), x_3:=\sin(x)$ in the third to write it as $$y=β_0+β_1x_1+β_2x_2+β_3x_3+ε.$$
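The renaming trick above is exactly how such models are fitted in practice: build a design matrix whose columns are the transformed regressors, then run ordinary least squares. A minimal sketch with NumPy, using hypothetical data generated from model 3 with known coefficients (all names here are illustrative, not from the original answer):

```python
import numpy as np

# Hypothetical data from model 3: y = β0 + β1*exp(x) + β2*cos(x) + β3*sin(x) + ε,
# with known coefficients so we can check what the fit recovers.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 200)
true_beta = np.array([1.0, 0.5, -2.0, 3.0])  # β0, β1, β2, β3
y = (true_beta[0]
     + true_beta[1] * np.exp(x)
     + true_beta[2] * np.cos(x)
     + true_beta[3] * np.sin(x)
     + 0.01 * rng.standard_normal(x.size))

# Rename the nonlinear terms as new regressors x1 := exp(x), x2 := cos(x),
# x3 := sin(x); the model is then linear in the coefficients, so ordinary
# least squares applies directly.
X = np.column_stack([np.ones_like(x), np.exp(x), np.cos(x), np.sin(x)])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta_hat)  # close to [1.0, 0.5, -2.0, 3.0]
```

Note that nothing about the fitting step knows the regressors came from nonlinear functions of $x$; linearity in the coefficients is all that matters.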

By contrast, the following model is not a linear regression model, due to the $β_1^2$ term: $$y=β_0+β_1^2x+ε.$$

SECOND ANSWER

Yes. In your example, your goal is to find coefficients $a, b,$ and $c$ such that \begin{align*} y_1 &\approx a x_1^2 + b x_1 + c \\ y_2 &\approx a x_2^2 + b x_2 + c \\ y_3 &\approx a x_3^2 + b x_3 + c \\ \vdots & \\ y_N &\approx a x_N^2 + b x_N + c \end{align*} where the data points $(x_i,y_i)$ are given in advance.

In other words, we want $$ y \approx M z, $$ where $$ y = \begin{bmatrix} y_1 \\ \vdots \\ y_N \end{bmatrix}, \quad M = \begin{bmatrix} x_1^2 & x_1 & 1 \\ x_2^2 & x_2 & 1 \\ \vdots & \vdots & \vdots \\ x_N^2 & x_N & 1 \end{bmatrix}, \quad z = \begin{bmatrix} a \\ b \\ c \end{bmatrix}. $$

Least squares selects $z$ by minimizing $\| Mz - y \|_2^2$. To find a minimizer, set the gradient equal to $0$, which yields $$ M^T(Mz - y) = 0. $$ This is a linear system of equations for $z$ (called the "normal equations", by the way).
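The derivation above can be sketched numerically. A minimal example with NumPy, using hypothetical data sampled from a known quadratic (the specific coefficients and noise level are assumptions for illustration): we build $M$, solve the normal equations $M^T M z = M^T y$, and recover $z = (a, b, c)$.

```python
import numpy as np

# Hypothetical data from y = 2x^2 - 3x + 1 plus small noise,
# so we know what coefficients the fit should recover.
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 50)
y = 2.0 * x**2 - 3.0 * x + 1.0 + 0.01 * rng.standard_normal(x.size)

# Design matrix M with columns x^2, x, 1 (one row per data point).
M = np.column_stack([x**2, x, np.ones_like(x)])

# Solve the normal equations M^T (M z - y) = 0 for z = (a, b, c).
z = np.linalg.solve(M.T @ M, M.T @ y)
print(z)  # close to [2, -3, 1]
```

In production code one would prefer `np.linalg.lstsq` (or a QR-based solver) over forming $M^T M$ explicitly, since the normal equations square the condition number of $M$; the direct solve above just mirrors the derivation in the answer.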