Why does an algebraic multiplicity of n imply an n-dimensional eigenspace for a Hermitian matrix?


I want to prove that given any Hermitian operator, we can find an orthonormal eigenbasis for it. It is obvious that there are $n$ eigenvalues counting multiplicities, and it is easy to prove that eigenvectors belonging to distinct eigenvalues are orthogonal. But I run into trouble with algebraic multiplicities greater than one. If we assume that an eigenvalue of multiplicity $n$ has $n$ linearly independent eigenvectors, then we can again find orthogonal eigenvectors. But every source I look at immediately assumes that there are $n$ linearly independent eigenvectors. Is it obvious that an $N \times N$ Hermitian matrix has $N$ linearly independent eigenvectors?

Some proofs I've seen simply state that a Hermitian operator is diagonalizable, and thus must have $n$ linearly independent eigenvectors to which Gram-Schmidt can be applied for orthogonality. But when I look up proofs of diagonalizability, such as this one, they assume we have $n$ orthogonal eigenvectors, which makes them pretty much useless for me.

I may be missing something crucial here, but I can't figure out how to prove there are $N$ linearly independent eigenvectors, which seems to amount to proving that the eigenspace of any eigenvalue with algebraic multiplicity $n$ is $n$-dimensional. If someone knows of a way to prove this, it would be greatly appreciated.
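
As a quick sanity check of the claim (not a proof), here is a small numerical experiment of my own, assuming NumPy: a Hermitian matrix built to have the eigenvalue $1$ with algebraic multiplicity $2$, whose eigenspace for that eigenvalue turns out to be $2$-dimensional.

```python
import numpy as np

# Hermitian matrix constructed (for illustration) to have eigenvalue 1 with
# algebraic multiplicity 2: A = Q diag(1, 1, 4) Q* for a random unitary Q.
rng = np.random.default_rng(0)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
Q, _ = np.linalg.qr(M)                        # Q is unitary
A = Q @ np.diag([1.0, 1.0, 4.0]) @ Q.conj().T

print(np.round(np.linalg.eigvalsh(A), 10))    # [1. 1. 4.]

# Geometric multiplicity of lambda = 1 is dim ker(A - I) = 3 - rank(A - I).
print(3 - np.linalg.matrix_rank(A - np.eye(3), tol=1e-10))   # 2
```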


Theorem: A matrix $A$ can be diagonalized (i.e., has a full basis of eigenvectors) iff its minimal polynomial factors into a product of distinct linear factors.

Proof: You only need the 'if' part. To see why the 'if' part is true, suppose $A$ has a minimal polynomial that factors into the product of distinct linear factors $$ m(\lambda)=(\lambda-\lambda_{1})(\lambda-\lambda_{2})\cdots(\lambda-\lambda_{k}). $$ Let $p_{j}$ be the product of all linear factors of $m$ except for $(\lambda-\lambda_{j})$. Then $(A-\lambda_{j}I)p_{j}(A)x=m(A)x=0$ for every $x$, which means that $x_{j}=p_{j}(A)x$ is either $0$ or an eigenvector satisfying $Ax_{j}=\lambda_{j}x_{j}$. $A$ has a basis of eigenvectors because every $x$ can be written as a linear combination of such vectors $x_{j}=p_{j}(A)x$. In fact, $$ x = \frac{1}{p_{1}(\lambda_{1})}x_{1}+\cdots+\frac{1}{p_{k}(\lambda_{k})}x_{k}. $$ To see why this is true, notice that the following polynomial has degree at most $k-1$ and equals $1$ at the $k$ distinct points $\lambda_{1},\lambda_{2},\cdots,\lambda_{k}$, and therefore must be identically $1$: $$ \frac{1}{p_{1}(\lambda_{1})}p_{1}(\lambda)+\cdots+\frac{1}{p_{k}(\lambda_{k})}p_{k}(\lambda). \;\;\;\Box $$
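
The construction in this proof can be checked numerically. In the sketch below (my own illustration, assuming NumPy; the specific matrix is the same arbitrary example as above), $A$ is Hermitian with minimal polynomial $(t-1)(t-4)$; applying $p_j(A)$ to an arbitrary vector produces the eigenvector components, and the displayed linear combination recovers $x$.

```python
import numpy as np

# Hermitian A with eigenvalues {1, 1, 4}; its minimal polynomial is
# m(t) = (t - 1)(t - 4), a product of distinct linear factors.
rng = np.random.default_rng(0)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
Q, _ = np.linalg.qr(M)
A = Q @ np.diag([1.0, 1.0, 4.0]) @ Q.conj().T
I = np.eye(3)

x = rng.normal(size=3) + 1j * rng.normal(size=3)   # an arbitrary vector

# p_j(A) x: the product of all factors of m except (A - lambda_j I), applied to x.
x1 = (A - 4.0 * I) @ x          # p_1(A) x, lands in ker(A - 1*I)
x2 = (A - 1.0 * I) @ x          # p_2(A) x, lands in ker(A - 4*I)
print(np.allclose(A @ x1, 1.0 * x1))   # True: x1 is an eigenvector (or zero)
print(np.allclose(A @ x2, 4.0 * x2))   # True

# x = x1 / p_1(lambda_1) + x2 / p_2(lambda_2), with p_1(1) = -3 and p_2(4) = 3.
print(np.allclose(x, x1 / (1.0 - 4.0) + x2 / (4.0 - 1.0)))   # True
```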

Theorem: If $A$ is a Hermitian matrix and if $(A-\lambda I)^{2}x=0$ for some $x\ne 0$, then $(A-\lambda I)x=0$. Therefore the minimal polynomial of $A$ is a product of distinct linear factors.

Proof: Let $A$ be as stated, and suppose that $(A-\lambda I)^{2}x=0$ for some $x \ne 0$. Then $A-\lambda I$ is singular, so $\lambda$ is an eigenvalue of $A$; all eigenvalues of a Hermitian matrix are real, so $\lambda$ is real and $A-\lambda I$ is itself Hermitian. Using this in the inner product, $$ 0= ((A-\lambda I)^{2}x,x)=((A-\lambda I)x,(A-\lambda I)x)=\|(A-\lambda I)x\|^{2} \implies (A-\lambda I)x=0. \;\;\;\Box $$
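
A small numerical check of this second theorem (again an illustrative sketch with NumPy, not part of the proof): for a Hermitian $A$ the matrices $A-\lambda I$ and $(A-\lambda I)^{2}$ have the same rank, hence the same null space, whereas for a non-Hermitian Jordan block they do not.

```python
import numpy as np

# Hermitian case: rank(A - I) equals rank((A - I)^2), so ker((A - I)^2) = ker(A - I)
# and no generalized eigenvectors appear beyond the ordinary ones.
rng = np.random.default_rng(0)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
Q, _ = np.linalg.qr(M)
A = Q @ np.diag([1.0, 1.0, 4.0]) @ Q.conj().T
B = A - np.eye(3)
print(np.linalg.matrix_rank(B, tol=1e-10),
      np.linalg.matrix_rank(B @ B, tol=1e-10))          # 1 1

# Non-Hermitian contrast: for the Jordan block J, (J - I)^2 = 0 although (J - I) != 0,
# so the implication really does use the Hermitian hypothesis.
J = np.array([[1.0, 1.0], [0.0, 1.0]])
C = J - np.eye(2)
print(np.linalg.matrix_rank(C), np.linalg.matrix_rank(C @ C))   # 1 0
```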