Do signs matter in SVD?

I have written an algorithm to compute the SVD of a 2x2 matrix. I was checking against a Mathematica query, and I noticed that the signs in the $U$ and $V$ matrices do not match those from my algorithm. Does this matter, or is it still a valid decomposition?

I plan to use the SVD to compute the polar decomposition as well. Do the SVD signs matter for use in the polar decomposition?

Here is an outline of the algorithm:

//Compute A^T*A
a = A(0,0)*A(0,0) + A(1,0)*A(1,0)
b = A(0,1)*A(0,1) + A(1,1)*A(1,1)
c = A(0,0)*A(0,1) + A(1,0)*A(1,1)

//Compute eigenvalues
root = sqrt((a-b)*(a-b) + 4*c*c)
eig1 = (a+b+root)/2
eig2 = (a+b-root)/2

//Compute singular values for "Sigma" matrix
s1 = sqrt(eig1)
s2 = sqrt(eig2)

//Compute eigenvectors for "V" matrix
eigvec1 = (c, eig1-a)
eigvec1.normalize()
eigvec2 = (-eigvec1(1), eigvec1(0))

//Compute columns of "U" matrix
col1 = (
    (A(0,0)*eigvec1(0) + A(0,1)*eigvec1(1)) / s1,
    (A(1,0)*eigvec1(0) + A(1,1)*eigvec1(1)) / s1,
)
col2 = (-col1(1), col1(0))
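For reference, here is the outline transcribed into Python/NumPy (the function name `svd2x2` is just a label for this sketch; it follows the steps above exactly, including the final step that produces the sign problem I describe in the edit below):

```python
import numpy as np

def svd2x2(A):
    """2x2 SVD via the eigendecomposition of A^T A, following the outline."""
    # Entries of the symmetric matrix A^T A
    a = A[0, 0]**2 + A[1, 0]**2
    b = A[0, 1]**2 + A[1, 1]**2
    c = A[0, 0]*A[0, 1] + A[1, 0]*A[1, 1]

    # Eigenvalues of A^T A (the squares of the singular values)
    root = np.sqrt((a - b)**2 + 4*c*c)
    eig1 = (a + b + root) / 2
    eig2 = (a + b - root) / 2
    s1, s2 = np.sqrt(eig1), np.sqrt(eig2)

    # Columns of V: eigenvectors of A^T A
    v1 = np.array([c, eig1 - a])
    v1 /= np.linalg.norm(v1)
    v2 = np.array([-v1[1], v1[0]])    # v1 rotated by 90 degrees

    # Columns of U: u1 = A v1 / s1, u2 taken as a 90-degree rotation of u1
    u1 = A @ v1 / s1
    u2 = np.array([-u1[1], u1[0]])

    return np.column_stack([u1, u2]), np.diag([s1, s2]), np.column_stack([v1, v2])
```

Note that the last step forces $U$ to be a rotation, the same way $V$ is; this reproduces the decomposition for some inputs but gives the wrong sign on the second column of $U$ for others.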

Edit:

I just found out that this algorithm won't work with the matrix: $\begin{bmatrix}1 & 1 \\ 1 & 0\end{bmatrix}$. The $U$ and $V$ matrices are correct, except the signs. How do I correct them to give proper signs?


It is a bad idea to compute the SVD of a matrix by forming $A^tA$ and $AA^t$: this squares the condition number and can lose accuracy in the small singular values. Numerical analysts normally apply orthogonal transformations to reduce $A$ to bidiagonal form. The SVD of the bidiagonal matrix is then computed using standard algorithms such as the QR algorithm, divide and conquer, or bisection. Please see a standard textbook such as "Matrix Computations" by Gene Golub and Charles Van Loan.
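For comparison, a library routine such as NumPy's `numpy.linalg.svd` (which wraps LAPACK's bidiagonalization-based drivers) handles your matrix directly, whatever sign convention it happens to pick:

```python
import numpy as np

A = np.array([[1., 1.],
              [1., 0.]])

# LAPACK-based SVD: singular values come back in non-increasing order
U, s, Vt = np.linalg.svd(A)

# The decomposition reconstructs A regardless of which signs LAPACK
# chose for the columns of U and the rows of Vt.
print(np.allclose(U @ np.diag(s) @ Vt, A))   # True
print(s)                                     # approx [1.6180, 0.6180]
```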

For your particular problem, the matrix is already symmetric. In that case, we can start from the spectral decomposition $$ A = U \Lambda U^t, $$ where $\Lambda$ is the diagonal eigenvalue matrix and the orthogonal $U$ is the eigenvector matrix. Since $U$ is orthogonal, we have $U^tU = I$, where $I$ is the identity matrix. Here $$ \Lambda = \left[ \begin{array}{cc} -0.6180 & 0 \\ 0 & 1.6180 \end{array} \right] $$ $$ U = \left[ \begin{array}{cc} 0.5257 & -0.8507 \\ -0.8507 & -0.5257 \end{array} \right] $$

We first want the eigenvalues in decreasing order. Permuting the columns of $U$ and applying the same permutation to $\Lambda$, we have $$ \Lambda = \left[ \begin{array}{cc} 1.6180 & 0 \\ 0 & -0.6180 \end{array} \right] $$ $$ U = \left[ \begin{array}{cc} -0.8507 & 0.5257 \\ -0.5257 & -0.8507 \end{array} \right] $$ Note that $A=U\Lambda U^t$ is still a spectral decomposition.

However, in the SVD all the singular values have to be non-negative. To fix that, we multiply the second column of $\Lambda$ by $-1$, and to compensate for that action we multiply the second row of $U^t$ by $-1$; call the resulting matrix $V^t$. Now we have the SVD of $A$, $$ A = U\Sigma V^t, $$ where $$ \Sigma = \left[ \begin{array}{cc} 1.6180 & 0 \\ 0 & 0.6180 \end{array} \right] $$
$$ V = \left[ \begin{array}{cc} -0.8507 & -0.5257 \\ -0.5257 & 0.8507 \end{array} \right]. $$
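You can verify this numerically (the entries above are rounded to four decimals, hence the loose tolerance):

```python
import numpy as np

A = np.array([[1., 1.],
              [1., 0.]])
U = np.array([[-0.8507,  0.5257],
              [-0.5257, -0.8507]])
Sigma = np.diag([1.6180, 0.6180])
V = np.array([[-0.8507, -0.5257],
              [-0.5257,  0.8507]])

# U and V are orthogonal (to rounding), and U Sigma V^t recovers A
print(np.allclose(U.T @ U, np.eye(2), atol=1e-3))   # True
print(np.allclose(U @ Sigma @ V.T, A, atol=1e-3))   # True
```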

Now, I hope you understand the difference between the spectral decomposition and the SVD of a symmetric matrix. The spectral decomposition and the SVD of a real symmetric matrix share common properties, but we have to be careful when we use them.

(a) The singular values are always in non-increasing order. The eigenvalues of a real symmetric matrix are (usually) ordered from smallest to largest.
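LAPACK-based routines follow exactly these conventions: `numpy.linalg.eigh` returns eigenvalues in ascending order, while `numpy.linalg.svd` returns singular values in descending order:

```python
import numpy as np

A = np.array([[1., 1.],
              [1., 0.]])

w = np.linalg.eigvalsh(A)                # eigenvalues, ascending
s = np.linalg.svd(A, compute_uv=False)   # singular values, descending

print(w)  # approx [-0.6180, 1.6180]
print(s)  # approx [ 1.6180, 0.6180]
```

For a symmetric matrix the singular values are the absolute values of the eigenvalues, which is why the two lists agree up to sign and order here.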

(b) If $x$ is an eigenvector of the matrix $A$, then $-x$ is also a valid eigenvector. The same is true of the singular vectors. However, if the sign of a column of $U$ is changed, then the sign of the corresponding column of $V$ has to be changed as well.
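To see the pairing concretely: flip the sign of one column of $U$ alone and the product no longer reconstructs $A$; flip the matching column of $V$ as well and the decomposition is restored.

```python
import numpy as np

A = np.array([[1., 1.],
              [1., 0.]])
U, s, Vt = np.linalg.svd(A)
V = Vt.T

# Flip the sign of the first left singular vector only: decomposition breaks
U2 = U.copy()
U2[:, 0] *= -1
print(np.allclose(U2 @ np.diag(s) @ V.T, A))   # False

# Flip the matching right singular vector as well: decomposition restored
V2 = V.copy()
V2[:, 0] *= -1
print(np.allclose(U2 @ np.diag(s) @ V2.T, A))  # True
```

This is also the answer to the original sign question: $U$ and $V$ from different implementations may differ by such paired sign flips and still be valid SVDs of the same matrix.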