SVD: why do both $U$ and $V$ have to be orthonormal matrices?


I'm looking at the SVD factorization $A = U D V'$ starting from the pair of equations $A u = v d$ and $A' v = u d$, where $u$ and $v$ are singular vectors associated with $A$ and $A'$, and $d$ is the corresponding singular value.

It follows that $A u u' = v d u'$. Stacking the vectors into matrices and taking $U$ to be orthonormal (so $U U' = I$), we get $A = U D V'$. My question is: what am I doing wrong? (In the literature it is said that both $U$ and $V$ must be orthonormal.)

[Then why do we need to apply Gram-Schmidt to the eigenvectors forming $V$?] Thanks
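A small NumPy sketch (my own illustration, not part of the question) of why Gram-Schmidt is needed: for a repeated eigenvalue of $A'A$, any basis of the eigenspace consists of valid eigenvectors, but that basis need not be orthogonal, so one orthonormalizes it (here via QR, which amounts to Gram-Schmidt):

```python
import numpy as np

# A'A = diag(4, 4, 1) has the repeated eigenvalue 4.
AtA = np.diag([4.0, 4.0, 1.0])

# Both columns are valid eigenvectors for eigenvalue 4 ...
W = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(AtA @ W, 4 * W)   # eigenvector check
# ... but they are not orthogonal to each other:
print(W[:, 0] @ W[:, 1])             # nonzero inner product

# QR (Gram-Schmidt) orthonormalizes them while staying
# inside the same eigenspace:
Q, _ = np.linalg.qr(W)
assert np.allclose(AtA @ Q, 4 * Q)        # still eigenvectors
assert np.allclose(Q.T @ Q, np.eye(2))    # now orthonormal
```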

BEST ANSWER

If one uses the symmetric system $AV=UD$, $A'U=VD$ and assumes that $U'U=I$ is chosen, then $V'VD=V'A'U=D'U'U=D'$. Under the assumption that $D$ is invertible and all matrices are square, the orthogonality of $V$ follows.
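A quick numerical check of this chain of identities (a sketch using a random square matrix, which is almost surely invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# NumPy returns the SVD as A = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(A)
D = np.diag(s)
V = Vt.T

assert np.allclose(A @ V, U @ D)         # A V = U D
assert np.allclose(A.T @ U, V @ D)       # A'U = V D
assert np.allclose(V.T @ V @ D, D.T)     # V'V D = D'
assert np.allclose(V.T @ V, np.eye(4))   # hence V'V = I
```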

If one of the assumptions fails, full orthogonality, resp. orthonormality of the columns of $V$, is not strictly required. But it helps numerical stability, and the standard algorithm for the SVD splits off rotation or reflection factors and collects them in $U$ and $V$, so that $U$ and $V$ are orthogonal (isometric) by construction.