What does the orthogonalisation step for U and V in SVD do exactly?


I'm reading about Singular Value Decomposition and I'm confused about one aspect. I hope someone can clarify this for me.

So the SVD looks like this: $A = U \Sigma V^T$. One often reads something like "the columns of $U$ are orthonormal eigenvectors of $AA^T$, and the columns of $V$ are orthonormal eigenvectors of $A^TA$."

What I'm confused about is: what if the matrices $U$ and $V$ are not orthogonal? I've come across an example that computes the eigenvectors and eigenvalues of $AA^T$ and $A^TA$ and then orthogonalizes the eigenvectors. But then $U$ and $V$ no longer contain the original eigenvectors as columns, right? Does this mean the SVD is no longer exact, only an approximation?



BEST ANSWER

The spectral theorem tells us that for any symmetric matrix (such as $AA^T$ or $A^TA$) we can choose orthonormal eigenvectors. When we "orthogonalize" $U$, we start with any eigenvectors of $AA^T$ and then apply the Gram-Schmidt process within each eigenspace to get new columns for $U$ that are both eigenvectors of $AA^T$ and orthonormal. The new vectors are still eigenvectors because any linear combination of eigenvectors for the same eigenvalue is again an eigenvector for that eigenvalue; eigenvectors for distinct eigenvalues of a symmetric matrix are automatically orthogonal, so nothing needs fixing across eigenspaces. It is possible to do all of this precisely because $AA^T$ is a symmetric matrix, so the resulting SVD is still exact, not an approximation.
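A small numerical sketch of the point above (the matrix and vectors are my own toy example, not from the question): for a symmetric matrix with a repeated eigenvalue, two non-orthogonal eigenvectors in the same eigenspace can be run through Gram-Schmidt and remain eigenvectors, because the eigenspace is closed under linear combinations.

```python
import numpy as np

# Hypothetical symmetric matrix: eigenvalue 2 has a 2-dimensional eigenspace.
S = np.diag([2.0, 2.0, 5.0])

# Two independent but NON-orthogonal eigenvectors for eigenvalue 2:
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 1.0, 0.0])  # check: S @ v2 == 2 * v2

# Gram-Schmidt: normalize v1, subtract v2's projection onto it, normalize.
u1 = v1 / np.linalg.norm(v1)
w = v2 - (v2 @ u1) * u1
u2 = w / np.linalg.norm(w)

# The orthonormalized vectors are STILL eigenvectors for eigenvalue 2,
# since any linear combination of v1 and v2 stays in the eigenspace.
assert np.allclose(S @ u1, 2 * u1)
assert np.allclose(S @ u2, 2 * u2)
assert np.isclose(u1 @ u2, 0.0)  # and now they are orthogonal
```

Nothing about the eigen-structure was changed by the orthogonalization, which is why the decomposition built from these vectors stays exact.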

If the matrices $U$ and $V$ were not orthogonal, then the SVD would be a lot less convenient (and useful). In particular, instead of writing $A = U\Sigma V^T$, we would have to write $A = U \Sigma V^{-1}$, and inverting $V$ is far more expensive than transposing it. Things go further downhill from there when you try to use such a decomposition in applications.