So, let's say we want the projection of the vector v = [2, 1] onto a line L, where L = {c·x : c a scalar} and x is a vector that lies along the line.
I know that one way to express the projection is c·x for some scalar c (i.e., some scaled version of vector x), where c = (v · x)/(x · x).
Then, a second way to figure out the projection is to first normalize vector x to a unit vector u = x/‖x‖; the projection is then (v · u) u.
But the final way to figure it out is to see the projection as a linear transformation: the projection is A v for some matrix A. For some reason, when expressed as a matrix-vector product, it is less clear/intuitive for me.
The key point of confusion: I'm not sure how we get from the normalized vector u to a matrix A, such that we can then simply multiply A by vector v to get the projection.
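(One standard way to bridge this gap, written out as a derivation: the dot product v · u can be rewritten as the matrix product u^T v, and then regrouping the parentheses turns the scalar-times-vector formula into a matrix acting on v:

$$\operatorname{proj}_L(v) = (v \cdot u)\,u = u\,(u^{T} v) = (u\,u^{T})\,v$$

so the matrix is A = u u^T, the outer product of the unit vector with itself.)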
Hopefully this is clear.
A fact about projection matrices I hope you'll find interesting: they are always positive semi-definite. Every eigenvalue is either 1 (for an eigenvector in the column space) or 0 (for an eigenvector orthogonal to the column space). I am very new here; sorry if my answer is improper.
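You can check the eigenvalue claim numerically. A small NumPy sketch, using the line spanned by x = [2, 1] from the question:

```python
import numpy as np

# Projection onto the line spanned by x = [2, 1]: P = u u^T with u = x/||x||
x = np.array([2.0, 1.0])
u = x / np.linalg.norm(x)
P = np.outer(u, u)

# Eigenvalues of a projection matrix: 1 (directions kept) and 0 (directions sent to zero)
eigvals = np.sort(np.linalg.eigvalsh(P))
print(eigvals)  # → [0. 1.]
```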
I understood your question better after reading it again. In the general formula $$P = A(A^T A)^{-1}A^T,$$ the factor $$(A^T A)^{-1}$$ does the weighting job. If A had orthonormal columns, $$A^T A$$ would be the identity. You know that for a vector v, $$v^T v$$ is the square of the 2-norm of the vector. This could be the relation you are looking for.
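To make the connection concrete, here is a short numerical sketch (the vector v = [1, 3] is just an illustrative choice, not from the question) showing that the general formula P = A(AᵀA)⁻¹Aᵀ reduces to the rank-1 outer product u uᵀ when A has a single column:

```python
import numpy as np

# Line L spanned by x = [2, 1] (the vector from the question)
x = np.array([2.0, 1.0])
A = x.reshape(-1, 1)  # A's single column spans L

# General projection formula P = A (A^T A)^{-1} A^T
P = A @ np.linalg.inv(A.T @ A) @ A.T

# For a line, this is exactly the rank-1 matrix u u^T with u = x/||x||
u = x / np.linalg.norm(x)
assert np.allclose(P, np.outer(u, u))

# Projecting an illustrative vector v gives the same answer as (v . u) u
v = np.array([1.0, 3.0])
print(P @ v)  # → [2. 1.]
```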