In least squares, say we have $n$ points in 2-D space, and assume these points do not all lie on a line (a hyperplane in $\Bbb R^2$). Do we find an $n$-dimensional hyperplane on which all these points lie?
If yes, then say I want a line which is a least-squares approximation for the $n$ points. Is that line a projection of the $n$-dimensional hyperplane we found earlier?
EDIT: New Questions after reading the answer.
Say, after solving, I obtained $m$ and $b$. Questions:
What does the following vector represent? $$m \begin{bmatrix} x_1 \\ x_2 \\ \vdots\\ x_n \end{bmatrix} +b\begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1\end{bmatrix}$$
How does the line $y=mx+b$ in $\Bbb R^2$ relate to the above column space?
Assuming you're trying to do linear regression for $n$ points in $\Bbb R^2$, you are seeking a line of the form $y=mx+b$ that is the "best fit." This is often set up as a standard calculus question, minimizing the squared error. As a linear algebra problem, you consider the system of linear equations $mx_i + b = y_i$, $i=1,\dots,n$, and set this up as a matrix problem: $$\begin{bmatrix} x_1 & 1 \\ x_2 & 1 \\ \vdots & \vdots \\ x_n & 1\end{bmatrix}\begin{bmatrix} m \\ b \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n\end{bmatrix}, \quad A\mathbf x = \mathbf y, \text{ for short}.$$ Assuming the $n$ data points do not lie on a line, this linear system is inconsistent. We find the least-squares solution by projecting the vector $\mathbf y$ onto the column space (image) of $A$. So this is, in essence, doing a projection in $\Bbb R^n$ onto a $2$-dimensional linear subspace.
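To make this concrete, here is a minimal NumPy sketch (the data points are made up for illustration). It solves the normal equations $A^\top A \mathbf x = A^\top \mathbf y$ and then checks the projection property: the residual $\mathbf y - A\mathbf x$ is orthogonal to both columns of $A$.

```python
import numpy as np

# Hypothetical data: n = 5 points in R^2 that do not lie on one line.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])

# Build the n x 2 matrix A = [x | 1] from the system m*x_i + b = y_i.
A = np.column_stack([x, np.ones_like(x)])

# Least-squares solution of A @ [m, b] = y via the normal equations
# (equivalently, np.linalg.lstsq(A, y, rcond=None)).
m, b = np.linalg.solve(A.T @ A, A.T @ y)
print(m, b)  # m = 0.8, b = 1.4 for this data

# A @ [m, b] is the projection of y onto col(A), so the residual
# is orthogonal to every column of A: A^T (y - A v) = 0.
residual = y - A @ np.array([m, b])
print(A.T @ residual)  # ~ [0, 0], up to floating-point error
```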
But there are no $n$-dimensional hyperplanes in this story when your points live in $\Bbb R^2$.
If you try to do quadratic (or higher) regression, then you are looking at $3$-dimensional (and higher) linear subspaces of $\Bbb R^n$ given by the column spaces of matrices with more columns.
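As a sketch of the quadratic case (again with made-up data), the design matrix gains a column $x_i^2$, so its column space is a $3$-dimensional subspace of $\Bbb R^n$ whenever the $x_i$ take at least three distinct values:

```python
import numpy as np

# Hypothetical data, as before.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])

# n x 3 design matrix [x^2 | x | 1]; its column space is 3-dimensional.
A = np.column_stack([x**2, x, np.ones_like(x)])

# Coefficients [a, b, c] of the best-fit parabola y = a*x^2 + b*x + c.
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coeffs)
```

The same projection picture applies: $\mathbf y$ is projected onto this larger column space, so the quadratic fit can never have a worse squared error than the linear one.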