From what I understand about least squares, the line of best fit for a vector $Y$ in the model $Y = X\beta$ is taken to be the projection of $Y$ onto the column space of $X$. If $n > p$, so the number of equations exceeds the number of parameters, and I take $n = 3$ and $p = 2$, then in general no exact solution exists and I am projecting a three-dimensional vector onto a two-dimensional subspace. In this case the projection drops straight down onto the column space of $X$, as the picture shows:
However, I was wondering what happens when, say, $n = 2 < p = 3$? In other words, with two equations and three unknowns, is this synonymous with trying to project a two-dimensional vector into a three-dimensional space and minimizing the length of that residual? Is this why there exist multiple solutions? Thanks.
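A quick numerical check of both situations (a sketch using NumPy; the particular matrices here are made up purely for illustration):

```python
import numpy as np

# Overdetermined case: n = 3 observations, p = 2 parameters.
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
Y = np.array([0.0, 1.0, 3.0])
beta = np.linalg.lstsq(X, Y, rcond=None)[0]
# X @ beta is the projection of Y onto the column space of X,
# so the residual Y - X @ beta is orthogonal to every column of X.
print(X.T @ (Y - X @ beta))  # numerically zero

# Underdetermined case: n = 2 equations, p = 3 parameters.
X2 = np.array([[1.0, 0.0, 1.0],
               [0.0, 1.0, 1.0]])
Y2 = np.array([1.0, 2.0])
beta2 = np.linalg.lstsq(X2, Y2, rcond=None)[0]  # the minimum-norm solution
# Any null-space vector of X2 can be added without changing the fit:
null_vec = np.array([1.0, 1.0, -1.0])  # X2 @ null_vec = 0
print(X2 @ (beta2 + 5.0 * null_vec) - Y2)  # still numerically zero
```

So in the $n < p$ case there is nothing left to minimize: the fit is exact, and the non-uniqueness comes from the null space of $X$.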

Let p be the number of parameters and t (I don't like n here :P) the number of observations, and let X be the regression matrix (t rows and p columns). Now imagine X has full rank t, where t < p: then the columns of X span all of $\mathbb{R}^t$, so every Y lies in the column space and $Y = X\beta$ has exact solutions — infinitely many, since any vector in the null space of X can be added to a solution without changing the fit.
As posted in the comments, however, the much more interesting case arises when X does not have full rank.
Now, if your matrix X has rank lower than the number of parameters, this means that some combination of your parameters cannot be inferred from the given data, leaving a whole subspace of solutions. Maybe best answering your question: the problem is still that of projecting the vector Y onto the subspace spanned by the columns of X. Roughly speaking, you fix everything in the subspace you know something about (the span of X) and declare that everything else is yet to be determined.
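One way to make that "subspace of solutions" concrete (a sketch; this X is a made-up rank-deficient matrix, and Y is chosen to lie in its column space):

```python
import numpy as np

# Rank-deficient regression matrix: the third column equals col1 + col2,
# so that combination of parameters cannot be inferred from the data.
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])
Y = np.array([1.0, 2.0, 3.0])  # lies in the column space of X

beta0 = np.linalg.lstsq(X, Y, rcond=None)[0]  # one particular solution

# Null space of X from the SVD: right singular vectors belonging to
# (numerically) zero singular values.
_, s, Vt = np.linalg.svd(X)
rank = int(np.sum(s > 1e-10))
null = Vt[rank:].T  # columns span the null space of X

# Every beta0 + t * (null direction) fits Y equally well:
for t in (0.0, 1.0, -3.0):
    print(X @ (beta0 + t * null[:, 0]) - Y)  # numerically zero each time
```

The solution set is the affine subspace beta0 + null space of X, which is exactly the "yet to be determined" part described above.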
The remaining parameter directions are "the same" in the sense that they can only be constructed as combinations of the already given ones.