Question about proving symmetric matrices are diagonalizable


Definition: An $n \times n$ matrix $A$ is orthogonally congruent to a matrix $B$ if there exists an orthogonal matrix $C$ such that $$A = C^{-1}BC.$$

Theorem: If $A$ is symmetric, then $A$ is orthogonally congruent to a diagonal matrix $B = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$, where $\lambda_1, \dots, \lambda_n$ are the (necessarily real) eigenvalues of $A$, with repeated eigenvalues listed according to their multiplicities.

Why is this true? If $A$ has $n$ distinct eigenvalues it is clear, since we can take $$C = \operatorname{col}(\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n),$$ where $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n$ are eigenvectors associated with the eigenvalues, so that $$AC = CB, \qquad B = \operatorname{diag}(\lambda_1, \dots, \lambda_n), \qquad A = CBC^{-1}.$$ But if we do the same thing when the eigenvalues are repeated, wouldn't $C$ have multiple linearly dependent columns? (If we use a repeated eigenvalue to solve for eigenvectors, wouldn't we get the same vector each time?) If so, then $\det(C) = 0$, so how can $$A = CBC^{-1}$$ hold when $C^{-1}$ doesn't exist?
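As a numerical sanity check (a sketch using NumPy, not part of the original question): even with a repeated eigenvalue, a symmetric matrix still has $n$ linearly independent, in fact orthonormal, eigenvectors, so $C$ is invertible.

```python
import numpy as np

# A symmetric matrix with a repeated eigenvalue: its eigenvalues are 4, 1, 1.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

# np.linalg.eigh is specialized for symmetric matrices and returns an
# orthonormal set of eigenvectors even when eigenvalues repeat.
eigvals, Q = np.linalg.eigh(A)        # eigenvalues in ascending order: 1, 1, 4

# The eigenvector matrix is orthogonal (Q^T Q = I), hence invertible:
print(np.allclose(Q.T @ Q, np.eye(3)))   # True
print(abs(np.linalg.det(Q)))             # close to 1, certainly not 0

# Reconstruct A from the diagonal matrix B = diag(1, 1, 4):
B = np.diag(eigvals)
print(np.allclose(A, Q @ B @ Q.T))       # True
```

The point is that for the repeated eigenvalue $1$, the eigenspace is $2$-dimensional, so we can pick two independent (even orthonormal) eigenvectors rather than the same vector twice.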

On BEST ANSWER

There are a few points you're missing here:

  • Just because we have $n$ linearly independent eigenvectors doesn't mean they form an orthonormal basis. If $A$ were any matrix with $n$ distinct eigenvalues, then $C = \operatorname{col}(v_1, \dots, v_n)$ would be invertible, but not necessarily orthogonal.
  • Some matrices have eigenspaces that are more than $1$-dimensional. For example, $\pmatrix{1&0&0\\0&1&0\\0&0&1}$ has only one eigenvalue, $\lambda = 1$, yet its eigenspace is all of $\mathbb{R}^3$, so it certainly has a basis of eigenvectors.

Hopefully that clears some things up. This, of course, is not a proof, but presumably this is proved in your textbook.