Prove that if $A$ is normal, then eigenvectors corresponding to distinct eigenvalues are necessarily orthogonal (alternative proof)


The problem statement is as follows:

Prove that for a normal matrix $A$, eigenvectors corresponding to different eigenvalues are necessarily orthogonal.

I can certainly prove that this is the case, using the spectral theorem. The gist of my proof is presented below.

If possible, I would like to find a simpler proof. I was hoping that there might be some sort of manipulation along these lines, noting that $$ \langle Av_1,A v_2\rangle = \langle v_1,A^*Av_2\rangle = \langle v_1,AA^*v_2\rangle = \langle A^* v_1,A^* v_2 \rangle $$

Any ideas here would be appreciated.
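For what it's worth, the identity above can be sanity-checked numerically (this is of course not a proof). The construction below manufactures a random normal matrix as $A = QDQ^*$ with $Q$ unitary, an assumption used only to generate an example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Build a random normal matrix A = Q D Q^* with Q unitary and D diagonal.
Q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
D = np.diag(rng.normal(size=n) + 1j * rng.normal(size=n))
A = Q @ D @ Q.conj().T
assert np.allclose(A @ A.conj().T, A.conj().T @ A)  # A is normal

v1 = rng.normal(size=n) + 1j * rng.normal(size=n)
v2 = rng.normal(size=n) + 1j * rng.normal(size=n)

# <A v1, A v2> = <A* v1, A* v2>, using <x, y> = x^H y (np.vdot conjugates
# its first argument).
lhs = np.vdot(A @ v1, A @ v2)
rhs = np.vdot(A.conj().T @ v1, A.conj().T @ v2)
assert np.isclose(lhs, rhs)
```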


My proof:

Let $\{v_{\lambda,i}\}$ be an orthonormal basis of eigenvectors (as guaranteed by the spectral theorem) such that $$ A v_{\lambda,i} = \lambda v_{\lambda,i}. $$ Let $(v_1,\lambda_1)$ and $(v_2,\lambda_2)$ be eigenpairs with $\lambda_1 \neq \lambda_2$. We may write $ v_1 = \sum_{\lambda,i}a_{\lambda,i}v_{\lambda,i} .$ We then have $$ 0 = Av_1 - \lambda_1 v_1 = \sum_{\lambda,i}(\lambda - \lambda_1)a_{\lambda,i}v_{\lambda,i}, $$ so that $a_{\lambda,i} = 0$ when $\lambda \neq \lambda_1$. Similarly, we may write $v_2 = \sum_{\lambda,i}b_{\lambda,i}v_{\lambda,i}$, and note that $b_{\lambda,i} = 0$ when $\lambda \neq \lambda_2$. From there, we have $$ \langle v_1,v_2 \rangle = \sum_{\lambda,i}a_{\lambda,i}\overline{b_{\lambda,i}}, $$ which must be zero since for each pair $(\lambda,i)$, either $a_{\lambda,i}=0$ or $b_{\lambda,i} = 0$.


There are 6 best solutions below

On BEST ANSWER

Assume $\;\lambda\neq \mu\;$ and

$$\begin{cases}Av=\lambda v\;\,\implies\; A^*v=\overline \lambda v\\{}\\Aw=\mu w\implies A^*w=\overline\mu w\end{cases}$$

From this we get:

$$\begin{cases}\langle v,Aw\rangle=\langle v,\mu w\rangle=\overline\mu\langle v,w\rangle\\{}\\ \langle v,Aw\rangle=\langle A^*v,w\rangle=\langle\overline\lambda v,w\rangle=\overline\lambda\langle v,w\rangle \end{cases}$$

and since $\;\overline\mu\neq\overline\lambda\;$ , we get $\;\langle v,w\rangle =0\;$

Question: Where did we use normality in the above?


Specializing your identity to $v_1=v_2=v$, we get $\|Av\|=\|A^*v\|$. Hence $\ker A=\ker A^*$. Recalling that $\ker A^* = (\operatorname{ran} A)^\perp$ for general $A$, we conclude that the kernel and range of a normal matrix are mutually orthogonal.

It remains to apply the above conclusion to $A-\lambda I$ (which is also normal), where $\lambda$ is an eigenvalue of $A$: if $Av=\lambda v$ and $Aw=\mu w$ with $\mu\neq\lambda$, then $v\in\ker(A-\lambda I)$ while $(\mu-\lambda)w=(A-\lambda I)w\in\operatorname{ran}(A-\lambda I)$, so $\langle v,w\rangle=0$.
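A quick numerical illustration of this answer (not a proof): build a normal matrix with a known eigenpair and check that the kernel of $A-\lambda I$ is orthogonal to its range. The construction $A=UDU^*$ with $U$ unitary is an assumption used only to manufacture a normal example:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Manufacture a normal A = U D U^* with a known eigenpair (lam, x = U[:, 0]).
U, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
evals = rng.normal(size=n) + 1j * rng.normal(size=n)
A = U @ np.diag(evals) @ U.conj().T
lam, x = evals[0], U[:, 0]

M = A - lam * np.eye(n)          # M is normal and M x = 0
assert np.allclose(M @ x, 0)

# ker M is orthogonal to ran M: x^H (M y) = 0 for arbitrary y.
y = rng.normal(size=n) + 1j * rng.normal(size=n)
assert np.isclose(np.vdot(x, M @ y), 0)
```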


Here is another simple proof that $$T^*v=\bar{\lambda}v ~\text{ if }~ Tv=\lambda v,$$ where $T$ is a normal operator on a Hilbert space $H$.

Let $V=\ker(T-\lambda I)$. Since $T^*$ commutes with $T-\lambda I$, $$T^*V\subset V.$$ Because $$\langle v,T^*v\rangle =\langle Tv,v\rangle =\langle \lambda v,v\rangle=\langle v,\bar{\lambda}v\rangle ~~\forall v \in V, $$ the polarisation identity gives $\langle u,T^*v\rangle =\langle u, \bar{\lambda}v\rangle ~\forall u,v\in V$. Since $T^*v-\bar{\lambda}v\in V$ is orthogonal to every $u\in V$, we conclude $T^*v=\bar{\lambda}v.$

REMARK: Let $\sigma:V\times V\to W$ be a sesquilinear map (linear in the first argument), where $V$ and $W$ are vector spaces over $\mathbb{C}$. The following formula is called the Polarisation Identity: $$\sigma(u,v)=\frac{1}{4}\sum_{k=0}^3 i^k\,\sigma(u+i^k v,\, u+i^k v). $$
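The polarisation identity (with the usual $\frac{1}{4}$ normalisation, for a form linear in the first argument) can be checked numerically on a concrete sesquilinear form $\sigma(u,v)=u^\top B\bar v$; the matrix $B$ here is an arbitrary example:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# A sesquilinear form: linear in the first argument, conjugate-linear in the second.
def sigma(u, v):
    return u @ B @ v.conj()

u = rng.normal(size=n) + 1j * rng.normal(size=n)
v = rng.normal(size=n) + 1j * rng.normal(size=n)

# Polarisation identity: sigma is recovered from its diagonal values.
recovered = sum(1j**k * sigma(u + 1j**k * v, u + 1j**k * v) for k in range(4)) / 4
assert np.isclose(recovered, sigma(u, v))
```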


This is from Linear Algebra Done Right by Axler. By the way, it's the greatest book about linear algebra I've ever seen!


Proof: Suppose $\alpha,\beta$ are distinct eigenvalues of $T$, with corresponding eigenvectors $u,v$. Thus, $Tu = \alpha u$ and $Tv = \beta v$. From 7.21 we have $T^*v = \bar \beta v.$ Thus, \begin{align} (\alpha - \beta)\langle u,v \rangle &= \langle \alpha u,v \rangle - \langle u, \bar \beta v \rangle \\ &= \langle Tu,v \rangle - \langle u, T^*v \rangle \\ & = 0. \end{align} Because $\alpha \neq \beta$, the equation above implies $\langle u,v \rangle = 0$. Thus, $u$ and $v$ are orthogonal, as desired.


I'll use the notation from Introduction to Linear Algebra by Strang. Let $A^H$ be the transpose conjugate of $A$, and suppose $A$ is normal, i.e. $AA^H = A^H A$.

Examine an eigenpair $Ax = \lambda x$. Multiplying by $A^H$ and using normality, we have $A^H Ax = AA^H x = \lambda A^H x$, so $A(A^H x) = \lambda (A^H x)$. Assume the eigenspace of $\lambda$ is one-dimensional (see note at end); then $A^H x$ is a multiple of $x$, i.e. $A^H x = c x$. Multiplying by $x^H$ gives $x^H A^H x = c\, x^H x$, and since $x^H A^H = (Ax)^H = \bar\lambda x^H$, we get $c = \bar \lambda$, the conjugate of $\lambda$.

Now consider another eigenpair $A y = \mu y$ with $\mu \neq \lambda$. From the result in the previous paragraph, $A^H y = \bar\mu y$; taking conjugate transposes gives $y^H A = \mu y^H$. Now multiply on the right by $x$ and we obtain $\mu y^H x = \lambda y^H x$ (using $Ax = \lambda x$), which forces $y^H x = 0$ since $\mu \neq \lambda$.

Note: if there are multiple independent $x$ with eigenvalue $\lambda$, then choose $x$ to be a linear combination that is also an eigenvector of $A^H$. Such a choice is always possible given that there's at least one eigenvector with eigenvalue $\lambda$ (proof is left as an exercise for the reader).
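As a concrete sanity check of the key step $A^H x = \bar\lambda x$, a plane rotation is a normal (but not symmetric) real matrix with known complex eigenpairs:

```python
import numpy as np

theta = 0.7
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s], [s, c]])       # rotation: normal but not symmetric
assert np.allclose(R @ R.T, R.T @ R)

lam = c + 1j * s                      # eigenvalue e^{i theta}
x = np.array([1, -1j]) / np.sqrt(2)   # corresponding eigenvector
assert np.allclose(R @ x, lam * x)                    # R x = lambda x
assert np.allclose(R.conj().T @ x, np.conj(lam) * x)  # R^H x = conj(lambda) x
```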


The answers to this question are very good but they are glossing over the fact that if $A \in \mathbb C^{n \times n}$ is normal then $Ax = \lambda x \implies A^*x = \bar \lambda x$. I think it's helpful to just spell out the whole thing explicitly.

Lemma: If $M \in \mathbb C^{n \times n}$ is normal, then $M$ and $M^*$ have the same null space.

Proof: Let $M \in \mathbb C^{n \times n}$ be a normal matrix. Then \begin{align} &M^*x = 0 \\ \iff & \| M^* x \|^2 = 0 \\ \iff & \langle M^*x, M^* x \rangle = 0 \\ \iff & \langle M M^* x, x \rangle = 0 \\ \iff & \langle M^* M x, x \rangle = 0 \\ \iff & \langle Mx, Mx \rangle = 0 \\ \iff & \| Mx \|^2 = 0 \\ \iff & Mx = 0. \end{align}

Lemma: If $A \in \mathbb C^{n \times n}$ is normal then $Ax = \lambda x \implies A^* x = \bar \lambda x$.

Proof: Suppose that $A \in \mathbb C^{n \times n}$ is normal and $Ax = \lambda x$. So $x$ is a null vector of $M = A - \lambda I$. Note that $M$ is normal, as you can check by expanding both $M^* M$ and $M M^*$. By the above lemma, $x$ is also a null vector of $M^* = A^* - \bar \lambda I$. Thus $A^*x = \bar \lambda x$.

Theorem: If $x$ and $y$ are eigenvectors corresponding to distinct eigenvalues of a normal matrix $A \in \mathbb C^{n \times n}$, then $\langle x, y \rangle = 0$.

Proof: Let $A \in \mathbb C^{n \times n}$ be a normal matrix, and suppose that $x$ and $y$ are eigenvectors of $A$ corresponding to distinct eigenvalues $\lambda$ and $\gamma$, respectively. Note that $$\langle x, Ay \rangle = \langle x, \gamma y \rangle = \bar \gamma \langle x, y \rangle.$$ On the other hand, $$ \langle x, Ay \rangle = \langle A^* x, y \rangle = \langle \bar \lambda x, y \rangle = \bar \lambda \langle x, y \rangle. $$ So we find that $\bar \gamma \langle x, y \rangle = \bar \lambda \langle x, y \rangle$, which implies that $\langle x, y \rangle = 0$.
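As a closing sanity check (not part of the proof), the cyclic shift is a non-Hermitian normal matrix whose eigenvalues, the 4th roots of unity, are all distinct, so its eigenvectors should come out pairwise orthogonal:

```python
import numpy as np

# A concrete non-Hermitian normal matrix: the cyclic shift (a permutation
# matrix, so it commutes with its conjugate transpose).
A = np.roll(np.eye(4), 1, axis=0)
assert np.allclose(A @ A.conj().T, A.conj().T @ A)

evals, evecs = np.linalg.eig(A)  # eigenvalues are the 4th roots of unity
for i in range(4):
    for j in range(4):
        if not np.isclose(evals[i], evals[j]):
            # eigenvectors for distinct eigenvalues are orthogonal
            assert np.isclose(np.vdot(evecs[:, i], evecs[:, j]), 0)
```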