I have a set of 3D points, and to find the line of best fit through them, I've used SVD. Using NumPy's SVD implementation, I'm taking the first right singular vector, $v$, as the direction of the line of best fit (based on https://stackoverflow.com/questions/2298390/fitting-a-line-in-3d/2333251#2333251).
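For context, the fit described above can be sketched as follows (the data here is synthetic, just for illustration):

```python
import numpy as np

# Synthetic 3D points scattered around a known line (illustrative only).
rng = np.random.default_rng(0)
t = np.linspace(-5, 5, 50)
true_dir = np.array([1.0, 2.0, 3.0])
true_dir /= np.linalg.norm(true_dir)
points = t[:, None] * true_dir + rng.normal(scale=0.1, size=(50, 3))

# Center the points, then take the first right singular vector as the
# direction of the best-fit line (as in the linked answer). Note that
# SVD only determines v up to sign: -v is an equally valid answer.
centroid = points.mean(axis=0)
_, _, vt = np.linalg.svd(points - centroid)
v = vt[0]  # unit direction of the best-fit line; sign is arbitrary
```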
I know that the signs don't matter in solving the SVD, but I'm using the vector $v$ thereafter to calculate the angle between it and another vector (let's call it $v'$).
Tl;dr: depending on the sign/direction of the singular vector $v$, the angle between $v$ and $v'$ changes hugely!
How can I fix this problem? I've come across this paper http://prod.sandia.gov/techlib/access-control.cgi/2007/076422.pdf, but I'm struggling to understand it, and I find its MATLAB code just as confusing. Thanks!
A line does not have a direction. If $v$ is a vector, then $L_1 = \{av : a \in \Bbb R\}$ and $L_2 = \{a(-v) : a \in \Bbb R\}$ constitute the same line in space.
If you compute the angle $\theta_1$ between $v$ and $v'$ and the angle $\theta_2$ between $-v$ and $v'$, you will find that $\theta_1 + \theta_2 = 180^\circ$. (Draw a picture!) Since $0 \le \theta_1, \theta_2 \le 180^\circ$, it must therefore be the case that one of the angles $\theta_1$ or $\theta_2$ is less than or equal to $90^\circ$. This is the most natural choice for the angle between two lines: whichever is less than or equal to $90^\circ$.
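In code, the easiest way to get this sign-independent angle is to take the absolute value of the cosine before applying $\arccos$. A minimal NumPy sketch (the function name `line_angle` is my own):

```python
import numpy as np

def line_angle(v, v_prime):
    """Angle between the *lines* spanned by v and v_prime, in [0, 90] degrees.

    Taking the absolute value of the cosine makes the result independent
    of the arbitrary sign of either vector, which is exactly what we want
    when v comes from an SVD (where v and -v are interchangeable).
    """
    cos_theta = abs(np.dot(v, v_prime)) / (np.linalg.norm(v) * np.linalg.norm(v_prime))
    # Clip guards against tiny floating-point overshoots outside [0, 1].
    return np.degrees(np.arccos(np.clip(cos_theta, 0.0, 1.0)))
```

With this, `line_angle(v, v_prime)` and `line_angle(-v, v_prime)` always agree, so the sign ambiguity of the SVD no longer matters.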