Inverse of diagonalizable matrix is diagonalizable


Let $A \in M_n(\mathbb C)$ be invertible. Prove that $A$ is diagonalizable if and only if $A^{-1}$ is diagonalizable.

This is what I have for one direction of the proof: Suppose $A$ is diagonalizable. Then there exist a diagonal matrix $D \in M_n(\mathbb C)$ and an invertible matrix $S \in M_n(\mathbb C)$ such that $A=SDS^{-1}$. So $$A=SDS^{-1}$$ $$A^{-1}A=A^{-1}SDS^{-1}$$ $$I_n=A^{-1}SDS^{-1}$$ $$S=A^{-1}SD$$ $$SD^{-1}=A^{-1}S$$ $$SD^{-1}S^{-1}=A^{-1}$$

In my deduction above, I assumed that $D$ is invertible. I know this is the case: $D$ is the diagonal matrix whose entries are the eigenvalues of $A$. Since $A$ is invertible, every eigenvalue satisfies $\lambda \neq 0$, so $\det(D) \neq 0$ and therefore $D$ is invertible. But how can I show that the entries of $D$ are exactly the eigenvalues of $A$?

I already proved that for an invertible matrix $A$, $\lambda$ is an eigenvalue of $A$ if and only if $1/ \lambda$ is an eigenvalue of $A^{-1}$. Can I use this to somehow prove that the entries in $D$ are the eigenvalues of $A$?
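As a sanity check, the derivation above can be verified numerically. This is only a sketch on a small example matrix of my own choosing (not from the post); `numpy.linalg.eig` returns the eigenvalues and an eigenvector matrix playing the roles of $D$ and $S$:

```python
import numpy as np

# Example matrix (my own choice): invertible, with distinct
# eigenvalues 1 and 3, hence diagonalizable.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

w, S = np.linalg.eig(A)   # eigenvalues w, eigenvectors as columns of S
D = np.diag(w)

# A = S D S^{-1}
assert np.allclose(A, S @ D @ np.linalg.inv(S))

# D is invertible since no eigenvalue is zero, and the derivation
# gives A^{-1} = S D^{-1} S^{-1} with D^{-1} = diag(1/lambda_i).
assert np.allclose(np.linalg.inv(A), S @ np.diag(1.0 / w) @ np.linalg.inv(S))
```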

4 Answers

Best answer

When you write $A = SDS^{-1}$, where $D$ is a diagonal matrix, the diagonal entries of $D$ are the eigenvalues of $A$ and the columns of $S$ are the corresponding eigenvectors, since we have $AS = SD$.
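The relation $AS = SD$ can be checked column by column: multiplying $S$ by the diagonal matrix $D$ on the right scales column $i$ of $S$ by the $i$-th diagonal entry, so column $i$ is an eigenvector. A small numerical sketch (example matrix is my own choice):

```python
import numpy as np

# Arbitrary example: eigenvalues of this matrix are 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

w, S = np.linalg.eig(A)   # eigenvalues w, eigenvectors as columns of S
D = np.diag(w)

# A S = S D, i.e. column i of S is an eigenvector for eigenvalue w[i].
assert np.allclose(A @ S, S @ D)
for i in range(len(w)):
    assert np.allclose(A @ S[:, i], w[i] * S[:, i])
```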

Answer

Think about the process of diagonalizing a matrix. How do you find $S$ and $D$? You first find the eigenvalues of $A$ and the corresponding eigenvectors. The matrix $S$ has those eigenvectors as its columns, and $D$ is the diagonal matrix with the eigenvalues along the diagonal.

Answer

The method posted in the comment above gives an easy answer: if

$A = S D S^{-1}$,

then

$A^{-1} = (SDS^{-1})^{-1} = (S^{-1})^{-1} D^{-1} S^{-1} = S D^{-1} S^{-1}$.

Since $A$ and $S$ are invertible, so is $D = \operatorname{diag}(\lambda_1,\ldots,\lambda_n)$, and then $D^{-1} = \operatorname{diag}(\lambda_1^{-1},\ldots,\lambda_n^{-1})$. So $A^{-1}$ is diagonalizable. The other direction is, according to taste, either entirely similar or actually deduced from what we already did by plugging in $A^{-1}$ in place of $A$.

However, I want to answer the question in a slightly less easy way but which gives more. The calculation actually shows that the same matrix $S$ diagonalizes both $A$ and $A^{-1}$. (One says that $A$ and $A^{-1}$ are simultaneously diagonalizable.) What does that mean? A matrix $S$ diagonalizes a matrix $A$ if and only if the columns of $S$ are eigenvectors for $A$, so we're seeing that $A$ and $A^{-1}$ admit a common basis of eigenvectors. But that suggests an even stronger fact.

Proposition: Let $A$ be any invertible matrix.
a) A nonzero vector $v \in \mathbb{C}^n$ is an eigenvector for $A$ if and only if it is an eigenvector for $A^{-1}$.
b) More precisely, zero is not an eigenvalue of either $A$ or $A^{-1}$ because they are invertible, and for any nonzero $\lambda$, the $\lambda$-eigenspace for $A$ is the $\lambda^{-1}$-eigenspace for $A^{-1}$: applying $A^{-1}$ to both sides of $Av = \lambda v$ and dividing by $\lambda$ gives $A^{-1}v = \lambda^{-1}v$, and the converse is symmetric.

In particular the sum of the dimensions of the eigenspaces for $A$ is always equal to the sum of the dimensions of the eigenspaces for $A^{-1}$. The case where these dimensions sum to $n$ recovers the case asked by the OP.
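The proposition lends itself to a quick numerical check. This is a sketch on a small symmetric example matrix of my own choosing, with known eigenvalues $1$ and $3$:

```python
import numpy as np

# Example matrix (my own choice): symmetric, eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
A_inv = np.linalg.inv(A)

w, V = np.linalg.eig(A)   # eigenvalues w, eigenvectors as columns of V

# Each eigenvector v of A with eigenvalue lam is also an eigenvector
# of A^{-1}, with eigenvalue 1/lam:  A v = lam v  <=>  A^{-1} v = v/lam.
for lam, v in zip(w, V.T):
    assert np.allclose(A @ v, lam * v)
    assert np.allclose(A_inv @ v, v / lam)
```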

Answer

We can write a diagonalizable matrix in the basis of its eigenvectors, obtaining a diagonal matrix. The inverse of a diagonal matrix is another diagonal matrix, and a diagonal matrix is trivially diagonalizable. So the inverse of a diagonalizable matrix is diagonalizable. In simpler terms: a diagonalizable matrix $A$ lengthens some eigenvectors and shortens others, and the inverse $A^{-1}$ just does the reverse, shortening the ones $A$ lengthened and lengthening the ones $A$ shortened.
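The key step above, that the inverse of a diagonal matrix is again diagonal with reciprocal entries, can be sketched directly (the diagonal entries here are arbitrary nonzero examples):

```python
import numpy as np

# The inverse of a diagonal matrix is diagonal, with each nonzero
# diagonal entry replaced by its reciprocal.
D = np.diag([2.0, -3.0, 0.5])
D_inv = np.linalg.inv(D)

assert np.allclose(D_inv, np.diag([0.5, -1.0 / 3.0, 2.0]))
assert np.allclose(D @ D_inv, np.eye(3))
```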