What makes a linear regression model a linear combination of the input variables?


Having learned about linear combinations in linear algebra, and having moved on to statistics, I've come across the idea that linear regression models are linear combinations of input variables.

Firstly, in linear algebra, I would define a linear combination like this:

$$a_1 \vec v_1 +a_2 \vec v_2 +...+a_n \vec v_n$$

Where $a_1,...,a_n$ are scalars, and $\vec v_1,..., \vec v_n$ are vectors.

Suppose we have a linear regression model like this:

$$y(\vec x, \vec w) = w_0+w_1x_1+...+w_n x_n$$

Where $\vec x=(x_1,...,x_n)^T$.

Then I'm told that $w_0+w_1x_1+...+w_n x_n$ is a linear combination of the input variables. My mind is trained to think of linear combinations as sums of vectors with scalar coefficients, but I don't see any vectors being combined in this regression model. So is this actually a linear combination? Maybe it isn't, and we need to define the model some other way to make it one? Or maybe we can think of each $x_i$ as a 1-dimensional vector, so that the linear combination moves us along the real number line? I'm not sure. I'd like some perspective on this, in order to understand where the linear combination of input variables comes from.
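To make my confusion concrete, here is a small numerical sketch (the weights and inputs are made up purely for illustration). It evaluates the model term by term, and then as a dot product after prepending a constant $1$ to $\vec x$ so that $w_0$ becomes the coefficient of a feature that is always $1$:

```python
import numpy as np

# Hypothetical values, chosen only for illustration.
w = np.array([2.0, 0.5, -1.0, 3.0])  # (w0, w1, w2, w3)
x = np.array([4.0, 1.0, 2.0])        # (x1, x2, x3)

# Written out term by term, as in the model above:
y_termwise = w[0] + w[1]*x[0] + w[2]*x[1] + w[3]*x[2]

# The same value as a dot product: prepend a constant 1 to x so that
# w0 acts as the coefficient of a feature that is always equal to 1.
x_aug = np.concatenate(([1.0], x))   # (1, x1, x2, x3)
y_dot = np.dot(w, x_aug)

print(y_termwise, y_dot)  # both equal 9.0
```

Both expressions give the same number, so the prediction is a sum of scalars times (1-dimensional) quantities, but I still don't see what the "vectors" in this combination are supposed to be.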