Proving symmetric matrices are diagonalizable using the fact that eigenvectors must be orthogonal


I'm taking as given the fact that eigenvectors of a symmetric matrix corresponding to distinct eigenvalues must be orthogonal, and trying to show that (real) symmetric matrices are diagonalizable.

Knowing the aforementioned fact, we can conclude that there exists an orthogonal basis of eigenvectors for any symmetric matrix. We know a matrix $A$ is diagonalizable iff there exists a basis of eigenvectors. Therefore, symmetric matrices are diagonalizable. ∎

I'm not sure whether I'm making a logical leap when I conclude that there exists an orthogonal basis of eigenvectors; I'd appreciate any criticism or advice.
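As a numerical aside (not part of the original question; this assumes NumPy and its symmetric eigensolver `numpy.linalg.eigh`): the delicate case is a repeated eigenvalue, where the distinct-eigenvalue orthogonality fact says nothing about vectors inside a single eigenspace. Even there, an orthonormal eigenbasis still exists, as a quick check illustrates:

```python
import numpy as np

# A symmetric matrix with a repeated eigenvalue (its spectrum is 1, 1, 4),
# so the distinct-eigenvalue fact alone does not hand us a full orthogonal
# eigenbasis; within the 2-dimensional eigenspace for 1 we must still
# choose orthogonal vectors.
A = np.array([[2., 1., 1.],
              [1., 2., 1.],
              [1., 1., 2.]])

eigvals, Q = np.linalg.eigh(A)  # eigh: eigensolver for symmetric matrices

# The columns of Q are orthonormal eigenvectors, even within the repeated
# eigenspace, and they diagonalize A.
print(np.allclose(Q.T @ Q, np.eye(3)))             # True: orthonormal columns
print(np.allclose(Q @ np.diag(eigvals) @ Q.T, A))  # True: A = Q D Q^T
```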

On BEST ANSWER

One can proceed by induction:

  • The one-dimensional case is obvious, since a $1\times 1$ matrix acts by scalar multiplication and is therefore already diagonal.
  • Suppose all $n \times n$ symmetric matrices are diagonalisable. If $A$ is $(n+1) \times (n+1)$ and symmetric, we know it has an eigenvector $v$ (which we may take to be a unit vector) with eigenvalue $\lambda$. Let $u \in v^{\perp}$, that is, the space of vectors $u$ with $\langle u,v\rangle = 0$. Then, using the symmetry of $A$, $$ \langle Au, v\rangle = \langle u, Av\rangle = \langle u,\lambda v\rangle = \lambda \langle u,v\rangle = 0. $$ So $A(v^{\perp}) \subseteq v^{\perp}$, and therefore we can define an operator $B:v^{\perp} \to v^{\perp}$ by $Bu=Au$. This is of course also symmetric. But $v^{\perp}$ is $n$-dimensional, so with respect to an orthonormal basis of $v^{\perp}$, $B$ is represented by an $n \times n$ symmetric matrix. This is diagonalisable by the induction hypothesis. Hence the whole operator is, since we can write it as $A = B + \lambda v\langle v, -\rangle$, where $B$ is extended to all of $\mathbb{R}^{n+1}$ by $Bv=0$.
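The inductive step above can be run as an algorithm: peel off one unit eigenvector $v$, restrict $A$ to $v^{\perp}$, where it becomes an $n \times n$ symmetric matrix $B$, recurse on $B$, and lift the result back. A minimal sketch, assuming NumPy (the helper `orthonormal_eigenbasis` is hypothetical, and for brevity it fetches the single eigenvector via `eigh` rather than, say, power iteration):

```python
import numpy as np

def orthonormal_eigenbasis(A):
    """Build an orthonormal eigenbasis of a symmetric matrix A by mirroring
    the induction: split off one eigenvector, recurse on its orthogonal
    complement (hypothetical helper, not from the answer itself)."""
    n = A.shape[0]
    if n == 1:                      # base case: a 1x1 matrix is already diagonal
        return np.eye(1)
    _, vecs = np.linalg.eigh(A)     # grab one unit eigenvector (for brevity)
    v = vecs[:, 0]
    # Columns of W form an orthonormal basis of v-perp: they are the left
    # singular vectors of the projector I - v v^T with singular value 1.
    U, _, _ = np.linalg.svd(np.eye(n) - np.outer(v, v))
    W = U[:, : n - 1]
    B = W.T @ A @ W                 # A restricted to v-perp, still symmetric
    Q_sub = orthonormal_eigenbasis(B)
    # v together with the lifted eigenvectors of B gives the full basis.
    return np.column_stack([v, W @ Q_sub])

# Sanity check on a random symmetric matrix
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 4))
A = X + X.T
Q = orthonormal_eigenbasis(A)
D = Q.T @ A @ Q
print(np.allclose(Q.T @ Q, np.eye(4)))          # True: orthonormal columns
print(np.allclose(D, np.diag(np.diagonal(D))))  # True: Q^T A Q is diagonal
```

The lifting works because $A$ maps $v^{\perp}$ into itself: if $q$ is an eigenvector of $B = W^{T}AW$, then $AWq = WBq = \mu Wq$, so $Wq$ is an eigenvector of $A$.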