Proof of "Singular values of a normal matrix are the absolute values of its eigenvalues"


I want a simple proof of this fact using only definitions and basic facts. I've searched for it for some time and I couldn't find a satisfying proof. So I attempted to do it myself.

Let $A \in \mathbb{C}^{n \times n}$ and let $A^H$ denote the conjugate transpose of $A$. Suppose $A$ is normal, i.e. $A A^H = A^H A$. The singular values of $A$ are defined as the numbers $\sigma \in \mathbb{R}_{\geq 0}$ such that

$$ \begin{align*} A v &= \sigma u \\ A^H u &= \sigma v \end{align*} $$

where $u^H u = v^H v = 1$; $u$ and $v$ are called the left and right singular vectors, respectively.

Now multiplying the first equation by $A^H$ and the second equation by $A$ from the left, we obtain $$ \begin{align*} A^H A v &= \sigma A^H u = \sigma^2 v \\ A A^H u &= \sigma A v = \sigma^2 u \end{align*} $$

Since $A$ is normal, we obtain $$ \begin{align*} A A^H v &= \sigma^2 v \\ A A^H u &= \sigma^2 u \end{align*} $$ so $u$ and $v$ are both eigenvectors of $A A^H$ for the same eigenvalue $\sigma^2$.

If the singular values are distinct, I believe we can conclude that either $u = v$ or $u = -v$. Then by the definition either $\lambda = \sigma$ or $\lambda = -\sigma$ is an eigenvalue of $A$, so $\sigma = |\lambda|$.

Edit: As @levap pointed out, we can only conclude that $u = e^{i \theta} v$ for some $\theta \in [0, 2 \pi)$. Then we see that

$$ \begin{align*} Av &= \sigma e^{i \theta} v \\ A^H v &= \sigma e^{-i \theta} v \end{align*} $$

So, we can say that $\lambda = \sigma e^{i \theta}$ is an eigenvalue of $A$. Also, by the lemma given below, $\sigma^2 \geq 0$, so $\sigma \in \mathbb{R}$ and we can always choose $\sigma \geq 0$ (replacing $u$ by $-u$ in the definition if $\sigma < 0$). Therefore, we can conclude that $\sigma = |\lambda|$. Also, $v$ is an eigenvector of $A^H$ with eigenvalue $\bar{\lambda} = \sigma e^{-i \theta}$.
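As a sanity check of the conclusion $\sigma = |\lambda|$, here is a short NumPy sketch. The construction of a random normal matrix by unitary conjugation is my own choice for illustration, not part of the argument:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Build a random normal matrix A = Q diag(lam) Q^H with Q unitary,
# so A A^H = A^H A holds by construction.
lam = rng.standard_normal(n) + 1j * rng.standard_normal(n)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
A = Q @ np.diag(lam) @ Q.conj().T

# Singular values (returned in descending order) should equal the
# absolute values of the eigenvalues, sorted the same way.
sigma = np.linalg.svd(A, compute_uv=False)
abs_lam = np.sort(np.abs(lam))[::-1]
print(np.allclose(sigma, abs_lam))  # True
```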

Lemma. Eigenvalues of $AA^H$ are real and non-negative.

Proof. Let $A A^H v = \sigma^2 v$ with $v^H v = 1$. Then $0 \leq \lVert A^H v \rVert_2^2 = v^H A A^H v = \sigma^2 v^H v = \sigma^2$. $\square$
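The lemma holds for any complex matrix, since $A A^H$ is Hermitian positive semi-definite. A minimal NumPy check (the random test matrix is illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)

# A A^H is Hermitian for ANY complex A, so eigvalsh applies and
# returns real eigenvalues; the lemma says they are also >= 0.
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
w = np.linalg.eigvalsh(A @ A.conj().T)
print(np.all(w >= -1e-12))  # non-negative up to roundoff: True
```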

What happens if singular values are not distinct?

Answer:

Since you work over $\mathbb{C}$, if the singular values are distinct you can only conclude that $u = e^{i\theta}v$, but this is enough.

If the singular values are not distinct, I doubt you'll find a "completely elementary" proof which avoids something like the spectral theorem as you need to know something about the eigenvalues of $AA^H$ and their relation to the eigenvalues of $A$.

Using the spectral theorem, I can offer the following argument. If $A$ is normal and $v$ is an eigenvector of $A$ with eigenvalue $\lambda$, then $v$ is also an eigenvector of $A^H$ with eigenvalue $\overline{\lambda}$. Denote the eigenvalues of $A$ by $\lambda_1, \ldots, \lambda_n$. Since $A$ is diagonalizable, we can choose a basis $(v_1, \ldots, v_n)$ of eigenvectors with $Av_i = \lambda_i v_i$. Then we have

$$ AA^H(v_i) = A(\overline{\lambda_i} v_i) = \lambda_i \overline{\lambda_i} v_i = |\lambda_i|^2 v_i $$

which shows that the eigenvalues of $AA^H$ are $|\lambda_1|^2, \ldots, |\lambda_n|^2$. Since you have shown that $\sigma^2$ is an eigenvalue of $AA^H$, we have $\sigma = |\lambda_i|$ for some $1 \leq i \leq n$.
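This last computation can also be checked numerically; a sketch, again building a random normal matrix by unitary conjugation (my construction, for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Normal matrix A = Q diag(lam) Q^H with Q unitary.
lam = rng.standard_normal(n) + 1j * rng.standard_normal(n)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
A = Q @ np.diag(lam) @ Q.conj().T

# The eigenvalues of A A^H should be |lambda_i|^2.
w = np.sort(np.linalg.eigvalsh(A @ A.conj().T))
print(np.allclose(w, np.sort(np.abs(lam) ** 2)))  # True
```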