I've been reviewing some concepts in Linear Algebra, specifically watching the Essence of Linear Algebra by 3blue1brown on YouTube.
As I was watching the Dot Product video, I realized that the dot product is a form of dimensionality reduction/compression (specifically the portion of the video where he shows the projection of 2D points onto a 1D line).
I'm curious whether this line of thinking makes sense and, more specifically, what it means in higher dimensions. For example, the dot product of two 100-dimensional vectors produces a single value, which seems to imply that a lot of data is encoded in this one number.
Edit: Added link to video (time stamp 4:11)
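For concreteness, here's a small sketch of the projection picture from the video in plain Python (the line direction and the points are made-up numbers, purely for illustration):

```python
import math

# Unit vector defining the 1D line (an assumed direction, for illustration).
u = (2 / math.sqrt(5), 1 / math.sqrt(5))

# Some 2D points to project onto the line.
points = [(3.0, 1.0), (-1.0, 2.0), (0.5, 0.5)]

# The dot product with the unit vector maps each 2D point to a single
# coordinate along the line: a 2D -> 1D "compression".
projected = [p[0] * u[0] + p[1] * u[1] for p in points]
print(projected)  # one number per original 2D point
```

Each 2D point collapses to one number, which is exactly the "a lot of data encoded in one number" feeling I'm asking about.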
Your view is correct to a good extent. When two vectors are similar, their cosine similarity (the dot product of the normalized vectors) is at its highest value, 1. This means that instead of keeping both, one of them is enough to explain the variation. Now consider feature vectors, where each element holds a feature's value for a particular instance: among all such vectors, the one that lies closest to the largest number of others can be chosen to represent its close allies. This is only the rudimentary idea, but the same principle underlies most classical feature-reduction approaches, each with its own refinements.
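A toy sketch of that selection idea, assuming made-up feature vectors and an arbitrary 0.95 similarity threshold (plain Python, not any particular library's method):

```python
import math

def cosine(a, b):
    """Cosine similarity: the dot product of a and b divided by their norms."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy feature vectors: each holds one feature's values across four instances.
# f0 and f1 are nearly parallel (close allies); f2 points elsewhere.
features = {
    "f0": [1.0, 2.0, 3.0, 4.0],
    "f1": [1.1, 2.1, 2.9, 4.2],
    "f2": [4.0, -1.0, 0.5, -3.0],
}

threshold = 0.95  # arbitrary cutoff for "close enough to be redundant"

# Count, for each feature, how many others it is nearly parallel to.
ally_counts = {
    name: sum(
        1
        for other, vec in features.items()
        if other != name and cosine(features[name], vec) > threshold
    )
    for name in features
}

# The feature with the most close allies is picked as the representative.
representative = max(ally_counts, key=ally_counts.get)
print(representative, ally_counts)
```

Here f0 and f1 are each other's allies while f2 stands alone, so one of the first two can stand in for both, dropping a redundant dimension.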