SVD and eigenvectors


Let $A = U \Sigma V^{H}$ be the SVD of $A$. I must show that the columns $v^{1}, \dots, v^{n}$ of $V$ form a complete set of eigenvectors of $A^{H}A$. Note that $\text{rank}(A) = r$ and $m \ge n \ge r$.

I substitute the SVD of $A$ into the eigenvector equation for $A^{H}A$, and then I want to show that each of the columns $v^{1}, \dots, v^{n}$ of $V$ solves it. Note that $\Sigma^{H} \Sigma$ is a product of two diagonal matrices and is therefore again diagonal.

$$(A^{H}A - \lambda I)x = 0\\ \iff ((V \Sigma^{H} U^{H}) (U \Sigma V^{H}) - \lambda I)x = 0\\ \iff ((V \Sigma^{H} \Sigma V^{H}) - \lambda I)x = 0$$
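As a numerical sanity check (not part of the proof), the identity $A^{H}A = V \Sigma^{H} \Sigma V^{H}$ used in the last step can be verified with NumPy; the sizes $m = 5$, $n = 3$ and the random complex matrix are arbitrary choices for illustration.

```python
import numpy as np

# Arbitrary illustrative sizes with m >= n
rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))

# Full SVD: A = U @ Sigma @ Vh, where Vh = V^H and Sigma is m x n
U, s, Vh = np.linalg.svd(A)
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(s)

# Since U^H U = I, A^H A collapses to V (Sigma^H Sigma) V^H
lhs = A.conj().T @ A
rhs = Vh.conj().T @ (Sigma.T @ Sigma) @ Vh
print(np.allclose(lhs, rhs))  # True
```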

From here on I am unsure how to conclude that the solutions $x$ of the last equation are exactly the columns $v^{1}, \dots, v^{n}$ of $V$.


There are 2 answers below.

BEST ANSWER

Let me simplify the previous answer a bit. When the columns of $P$ are eigenvectors for a matrix $A$, we have $$AP = P\operatorname{diag}(\lambda_1,\dots,\lambda_n)$$ since multiplying on the right by this diagonal matrix scales column $i$ by $\lambda_i$ (left multiplication is a row operation, right multiplication is a column operation).
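This column-scaling fact can be seen in a two-line check (the concrete matrices below are just an example):

```python
import numpy as np

P = np.array([[1.0, 2.0],
              [3.0, 4.0]])
D = np.diag([10.0, 100.0])

# Right-multiplying by diag(10, 100) scales column 0 by 10 and column 1 by 100
print(P @ D)  # [[ 10. 200.]
              #  [ 30. 400.]]
```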

In our case, we have

$$(A^HA)V = V(\Sigma^H\Sigma)V^HV = V(\Sigma^H\Sigma) = V\operatorname{diag}(\sigma_1^2,\dots,\sigma_r^2,0,\dots,0).$$

So it must be that the columns of $V$ are eigenvectors with eigenvalues $\sigma_1^2, \dots, \sigma_r^2,0,\dots,0$.
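A quick numerical illustration of this conclusion, using an arbitrary random real matrix (a generic such matrix has full rank, so here $r = n$):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 6, 4
A = rng.standard_normal((m, n))

U, s, Vh = np.linalg.svd(A)  # s has length n, sorted descending
V = Vh.conj().T
AtA = A.conj().T @ A

# Each column of V is an eigenvector of A^H A with eigenvalue sigma_i^2
for i in range(n):
    assert np.allclose(AtA @ V[:, i], s[i] ** 2 * V[:, i])
```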


If $x$ is an eigenvector of $A^{H}A$, then by your calculation (and my hint) $$ V \Sigma^H \Sigma V^H x = \lambda x, $$ which we can rearrange to $$ \Sigma^H \Sigma V^H x = \lambda V^H x. $$ Hence $y := V^H x$ is an eigenvector of $\Sigma^H \Sigma$ with eigenvalue $\lambda$. Then $\lambda \ne 0$ (why?).

Since $$\Sigma^H \Sigma = \operatorname{diag}(\sigma_1^2, \ldots, \sigma_r^2, 0, \ldots, 0)$$ is diagonal, the eigenvalue equation $\Sigma^H \Sigma y = \lambda y$ reads componentwise $$ \begin{cases} \sigma_k^2 y_k = \lambda y_k, & \text{if } k \in \{ 1, \ldots, r \}, \\ 0 = \lambda y_k, & \text{if } k \in \{ r + 1, \ldots, n \}. \end{cases} $$ Since $\lambda \ne 0$, the second set of equations forces $y_k = 0$ for all $k \in \{ r + 1, \ldots, n \}$. For $k \le r$, each equation forces either $\sigma_k^2 = \lambda$ or $y_k = 0$; having $\sigma_k^2 = \lambda$ for more than one index is impossible since the $\sigma_j$ are distinct, so $y_k = 0$ for all $k \in \{ 1, \ldots, r \}$ except a single index $j$ with $\lambda = \sigma_j^2$, and after normalizing, $y_j = 1$. Hence $y = e_j$, the $j$-th unit vector, for some $j \in \{ 1, \ldots, r \}$. Since the columns of $V$ are orthonormal, $y = V^H x = e_j$ gives $x = V e_j$, so $x$ must be the $j$-th column of $V$.
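As a cross-check of the eigenvalue claim (again with an arbitrary random matrix for illustration), the eigenvalues of $A^{H}A$ computed directly can be compared with the squared singular values:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))

s = np.linalg.svd(A, compute_uv=False)      # singular values, descending
evals = np.linalg.eigvalsh(A.conj().T @ A)  # eigenvalues of A^H A, ascending

# Up to ordering, the eigenvalues of A^H A are exactly the sigma_i^2
assert np.allclose(np.sort(s ** 2), evals)
```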