Diagonalizability condition for a matrix


I've recently seen the following theorem: an $n \times n$ matrix $A$ is diagonalizable $\iff$ $A$ has $n$ linearly independent eigenvectors.

Now, assume $A$ is diagonalizable. Then $\Lambda = P^{-1}AP$ for some nonsingular matrix $P$, where $\Lambda = \operatorname{diag}(\lambda_{1}, \dots, \lambda_{n})$. So it is easy to see that the columns of $P$ are eigenvectors of $A$, but why are they linearly independent?
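For reference, the columnwise computation behind "the columns of $P$ are eigenvectors" can be written out as follows (writing $p_i$ for the $i$-th column of $P$):

```latex
P^{-1}AP = \Lambda
\;\Longrightarrow\;
AP = P\Lambda
\;\Longrightarrow\;
A\,p_i = \lambda_i\,p_i \qquad (i = 1, \dots, n),
```

since multiplying $P$ on the right by the diagonal matrix $\Lambda$ scales the $i$-th column of $P$ by $\lambda_i$.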


Since $P$ is nonsingular, $P^{-1}$ exists, so $P$ is an $n \times n$ matrix of rank $n$. A square matrix has rank $n$ exactly when its $n$ columns are linearly independent, so the columns of $P$, which are eigenvectors of $A$, must be linearly independent. Otherwise $P$ would be singular and $P^{-1}$ would not exist.
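As a numerical sanity check, here is a small NumPy sketch (the matrix `A` is a made-up example with distinct eigenvalues, hence diagonalizable): the eigenvector matrix returned by `np.linalg.eig` has full rank, and conjugating by it recovers the diagonal of eigenvalues.

```python
import numpy as np

# Hypothetical example: a 3x3 matrix with distinct eigenvalues 2, 3, 5,
# so it is diagonalizable.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])

# np.linalg.eig returns the eigenvalues and a matrix P whose columns
# are the corresponding eigenvectors.
eigvals, P = np.linalg.eig(A)

# P is invertible, i.e. rank n, i.e. its columns are linearly independent.
n = A.shape[0]
print(np.linalg.matrix_rank(P) == n)

# Check the diagonalization Lambda = P^{-1} A P.
Lam = np.linalg.inv(P) @ A @ P
print(np.allclose(Lam, np.diag(eigvals)))
```

The two checks are equivalent faces of the same fact: `inv(P)` only exists because the eigenvector columns are independent.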

Note: The problem arises when an eigenvalue's geometric multiplicity is strictly less than its algebraic multiplicity; such an eigenvalue does not contribute enough independent eigenvectors, and this can only happen when the algebraic multiplicity is greater than $1$.
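To illustrate the note, consider a defective matrix (a made-up $2 \times 2$ Jordan block): the eigenvalue $1$ has algebraic multiplicity $2$ but geometric multiplicity $1$, so the eigenvector matrix is rank-deficient and no invertible $P$ exists.

```python
import numpy as np

# Hypothetical example: a Jordan block with eigenvalue 1 repeated twice.
# Its only eigenvectors are multiples of (1, 0).
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, V = np.linalg.eig(J)

# Both columns of V are (numerically) parallel to (1, 0), so V has
# rank < 2 and cannot serve as an invertible P.
print(np.linalg.matrix_rank(V) < 2)
```

Contrast this with the diagonalizable case: here the rank check fails precisely because one eigenvalue cannot supply two independent eigenvectors.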