I have two methods of computing a rotation matrix. One gives me the following matrix:
$$ R_1 = \begin{bmatrix} 0.29320837 & 0.89709592 & -0.33052649 \\ -0.85529364 & 0.40061608 & 0.32860241 \\ 0.42720211 & 0.18634823 & 0.88474442 \\ \end{bmatrix} $$
The other results in:
$$ R_2 = \begin{bmatrix} -0.29320837 & -0.89709592 & -0.33052649 \\ 0.85529364 & -0.40061608 & 0.32860241 \\ -0.42720211 & -0.18634823 & 0.88474442 \\ \end{bmatrix} $$
The only difference is the sign of the first two columns. Ultimately, I need to compute the corresponding quaternion for each matrix, and the two quaternions end up different:
$$ \begin{align} q_1 &= [0.80289614, -0.04429408, -0.23593606, -0.5456464] \\ q_2 &= [0.5456464 , -0.23593606, 0.04429408, 0.80289614] \end{align} $$
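For reference, here is a quick numeric check I did (a sketch in NumPy): negating the first two columns is the same as right-multiplying by $\operatorname{diag}(-1,-1,1)$, which is itself a proper rotation (180° about the local $z$-axis), so both matrices are valid rotation matrices.

```python
import numpy as np

R1 = np.array([[ 0.29320837,  0.89709592, -0.33052649],
               [-0.85529364,  0.40061608,  0.32860241],
               [ 0.42720211,  0.18634823,  0.88474442]])
R2 = np.array([[-0.29320837, -0.89709592, -0.33052649],
               [ 0.85529364, -0.40061608,  0.32860241],
               [-0.42720211, -0.18634823,  0.88474442]])

# Negating the first two columns == right-multiplication by diag(-1, -1, 1),
# i.e. a 180-degree rotation about the local z-axis.
S = np.diag([-1.0, -1.0, 1.0])
print(np.allclose(R1 @ S, R2))  # True: the matrices differ by exactly this flip

# Both determinants are +1, so a determinant check alone cannot tell them apart.
print(np.linalg.det(R1), np.linalg.det(R2))
```

Since both matrices are proper rotations, there is no intrinsic property that flags one of them as "wrong".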
The question is: how can I enforce the signs of the first two columns so that I get consistent matrices regardless of which method I use? More specifically, I want the final result to follow the sign convention of $R_1$, not $R_2$. If I obtain a matrix following the convention of $R_2$, I want to detect that it deviates from the convention of $R_1$ and correct it.
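To make the intent concrete, here is a sketch of the kind of check I have in mind. It assumes a reference matrix in the desired convention is available, and that the two conventions differ at most by the $\operatorname{diag}(-1,-1,1)$ flip seen above; `match_convention` is a hypothetical helper name, not part of any library.

```python
import numpy as np

S = np.diag([-1.0, -1.0, 1.0])  # negates the first two columns

def match_convention(R, R_ref):
    """Hypothetical sketch: return R expressed in the sign convention of R_ref.

    Assumes the two conventions differ (at most) by a right-multiplication
    with diag(-1, -1, 1), as in the R1/R2 example above.
    """
    # If the conventions agree, R_ref.T @ R is near the identity (trace ~ 3);
    # if they differ, it is near diag(-1, -1, 1) (trace ~ -1).
    if np.trace(R_ref.T @ R) < 1.0:
        return R @ S
    return R

R1 = np.array([[ 0.29320837,  0.89709592, -0.33052649],
               [-0.85529364,  0.40061608,  0.32860241],
               [ 0.42720211,  0.18634823,  0.88474442]])
R2 = R1 @ S  # the matrix in the "wrong" convention

print(np.allclose(match_convention(R2, R1), R1))  # True: flip detected and undone
print(np.allclose(match_convention(R1, R1), R1))  # True: already consistent, unchanged
```

Is there a cleaner or more standard way to do this, ideally one that does not need a reference matrix at all?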