Linear Algebra: SVD problem


I have an exercise in linear algebra on the chapter about SVD:

Show that if $P$ is an orthogonal $m\times m$ matrix and $A$ is an $m\times n$ matrix, then $PA$ has the same singular values as $A$.

I have a general SVD question and would also appreciate input on how to prove the answer to the question correctly.

My questions:

  1. My book never bothered to prove that every matrix has a singular value decomposition (can be "singular-value-decomposed", if I may turn it into a verb). Why is this so?

The things I understand: If you have a matrix $A$, then $A^TA$ and $AA^T$ are both symmetric and therefore orthogonally diagonalisable, with $A^TA=VD_VV^T$ and $AA^T=UD_UU^T$, where $U,V$ are orthogonal matrices whose columns are eigenvectors and $D_U,D_V$ are diagonal matrices of eigenvalues.
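As a quick numerical sanity check of this (not a proof), one can verify with NumPy that $A^TA$ is symmetric and orthogonally diagonalisable for a random $A$:

```python
import numpy as np

# Sanity check: for a random 4x3 matrix A, A^T A is symmetric
# and orthogonally diagonalisable, A^T A = V D V^T.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

AtA = A.T @ A
assert np.allclose(AtA, AtA.T)          # symmetric

eigvals, V = np.linalg.eigh(AtA)        # spectral decomposition
assert np.allclose(V.T @ V, np.eye(3))  # V is orthogonal: V^T V = I
assert np.allclose(V @ np.diag(eigvals) @ V.T, AtA)
```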

If $AA^T\vec{v}=\lambda\vec{v}$, then multiplying on the left by $A^T$ gives $A^TA(A^T\vec{v})=\lambda(A^T\vec{v})$.

Likewise, if $A^TA\vec{v}=\lambda\vec{v}$, then $AA^T(A\vec{v})=\lambda(A\vec{v})$.

So if $\vec{v}$ is an eigenvector of $A^TA$ with eigenvalue $\lambda\neq 0$, then $A\vec{v}$ is an eigenvector of $AA^T$ with the same eigenvalue $\lambda$. (The condition $\lambda\neq 0$ guarantees $A\vec{v}\neq\vec{0}$, since $\|A\vec{v}\|^2=\vec{v}^TA^TA\vec{v}=\lambda\|\vec{v}\|^2$.)
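The claim that the two products share their nonzero eigenvalues can be checked numerically; for a $4\times 3$ matrix of full rank, $AA^T$ has the three eigenvalues of $A^TA$ plus one extra zero:

```python
import numpy as np

# For a random full-rank 4x3 A, the nonzero eigenvalues of A^T A (3x3)
# and A A^T (4x4) coincide; A A^T has one extra zero eigenvalue.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))

ev_small = np.sort(np.linalg.eigvalsh(A.T @ A))  # 3 eigenvalues
ev_big = np.sort(np.linalg.eigvalsh(A @ A.T))    # 4 eigenvalues
assert np.isclose(ev_big[0], 0.0, atol=1e-10)    # extra eigenvalue is 0
assert np.allclose(ev_big[1:], ev_small)         # the rest agree
```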

If $AA^T$ and $A^TA$ are not of the same dimensions ($m\times m$ and $n\times n$), then it seems to me that either $A$ or $A^T$, whichever maps from the higher-dimensional space to the lower, must have a non-trivial null space. If we assume that $A$ maps from a higher dimension to a lower one, then it must carry some of the orthogonal eigenvectors of $A^TA$ onto orthogonal eigenvectors of $AA^T$, while the rest must span the null space and correspond to eigenvalue $0$. Am I correct in thinking like this?
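The null-space intuition can also be tested numerically: the eigenvectors of $A^TA$ with eigenvalue $0$ are exactly a basis of the null space of $A$. Here, for a random $3\times 5$ matrix of rank $3$, that null space is $2$-dimensional:

```python
import numpy as np

# Eigenvectors of A^T A with eigenvalue 0 span the null space of A.
# A is 3x5 with rank 3, so A^T A (5x5) has a 2-dimensional null space.
rng = np.random.default_rng(3)
A = rng.standard_normal((3, 5))

eigvals, V = np.linalg.eigh(A.T @ A)  # eigenvalues in ascending order
null_vecs = V[:, np.isclose(eigvals, 0.0, atol=1e-10)]
assert null_vecs.shape[1] == 2                    # dim = 5 - rank(A)
assert np.allclose(A @ null_vecs, 0.0, atol=1e-6) # A annihilates them
```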

This is as far as I've gotten.

  2. How do I solve the problem and prove it?

If we assume that $A=U\Sigma V^T$, where $U,V$ are orthogonal matrices and $\Sigma$ is a "diagonal" matrix, then $PA=(PU)\Sigma V^T$. Since $P$ is an orthogonal matrix, it doesn't change the lengths of vectors or the angles between them, so $PU$ will also be an orthogonal matrix: $(PU)^T(PU)=U^TP^TPU=U^TU=I$. Therefore $(PU)\Sigma V^T$ is an SVD of $PA$ with the same $\Sigma$, so the singular values are the same. But how do I prove this rigorously?
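The statement of the exercise itself is easy to check numerically before writing the proof; here $P$ is built as the orthogonal factor of a QR decomposition of a random matrix:

```python
import numpy as np

# Numerical check of the exercise: for orthogonal P, the singular
# values of P A equal those of A.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))
P, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal P
assert np.allclose(P.T @ P, np.eye(4))

sv_A = np.linalg.svd(A, compute_uv=False)
sv_PA = np.linalg.svd(P @ A, compute_uv=False)
assert np.allclose(sv_A, sv_PA)
```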