If $X \in \mathbb{R}^{n \times n}$ is invertible, does it always have an eigendecomposition?


How can I prove or disprove this claim?

Since $X$ is invertible, all of its eigenvalues are nonzero: $\lambda_i \neq 0$ for $i = 1, \cdots, n$. However, according to Wikipedia (https://en.wikipedia.org/wiki/Eigendecomposition_of_a_matrix), in addition to $X$ being square we need $n$ linearly independent eigenvectors for $X$ to have a spectral decomposition. We know $X$ is invertible, hence full rank, so all $n$ columns of $X$ are linearly independent; but does that imply that $X$ has $n$ linearly independent eigenvectors? So far I don't see why square and invertible are enough to conclude that $X$ has a spectral decomposition.


There are 2 answers below.


Square and invertible is not enough to guarantee the existence of a spectral decomposition. The standard counterexample is $$ X = \begin{pmatrix}1&1\\0&1\end{pmatrix}, $$ which is invertible (its determinant is $1$) but whose only eigenvalue is $1$, and every eigenvector is of the form $\left(\begin{smallmatrix}a\\0\end{smallmatrix}\right)$. So there is only one linearly independent eigenvector, and $X$ is not diagonalizable.
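To make this deficiency concrete, here is a small plain-Python sketch (my own illustration, not part of the answer) that checks the geometric multiplicity of the eigenvalue $1$: the eigenspace is the null space of $X - \lambda I$, so its dimension is $n - \operatorname{rank}(X - \lambda I)$.

```python
# Counterexample X = [[1, 1], [0, 1]]: invertible, but not diagonalizable.
# For the (only) eigenvalue lambda = 1, the eigenspace is the null space of
# X - lambda*I, so its dimension is n - rank(X - lambda*I).

def rank_2x2(m):
    """Rank of a 2x2 matrix [[a, b], [c, d]], using exact integer arithmetic."""
    a, b, c, d = m[0][0], m[0][1], m[1][0], m[1][1]
    if a * d - b * c != 0:                      # nonzero determinant => full rank
        return 2
    if any(x != 0 for x in (a, b, c, d)):       # nonzero matrix with det 0
        return 1
    return 0

X = [[1, 1], [0, 1]]
lam = 1  # the only root of the characteristic polynomial (1 - lam)^2

# X - lam*I = [[0, 1], [0, 0]], which has rank 1.
shifted = [[X[0][0] - lam, X[0][1]],
           [X[1][0], X[1][1] - lam]]
geometric_multiplicity = 2 - rank_2x2(shifted)

print(geometric_multiplicity)  # 1: only one independent eigenvector, so no eigendecomposition
```

Since the geometric multiplicity ($1$) is smaller than the algebraic multiplicity ($2$), there is no basis of eigenvectors, even though $\det X = 1 \neq 0$.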


If $X$ is invertible, its determinant must be nonzero. The determinant is also equal to the product of the eigenvalues. Therefore none of the eigenvalues can be zero.
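As a quick numeric illustration (my own example matrix, not from the answer): for a $2 \times 2$ matrix the eigenvalues are the roots of the characteristic polynomial $\lambda^2 - \operatorname{tr}(X)\lambda + \det(X)$, so by Vieta's formulas their product is exactly $\det(X)$.

```python
import math

# Hypothetical example matrix: X = [[2, 1], [1, 2]].
X = [[2, 1], [1, 2]]

trace = X[0][0] + X[1][1]                      # tr(X) = 4
det = X[0][0] * X[1][1] - X[0][1] * X[1][0]    # det(X) = 3

# Roots of lambda^2 - trace*lambda + det = 0 via the quadratic formula.
disc = math.sqrt(trace**2 - 4 * det)
lam1, lam2 = (trace + disc) / 2, (trace - disc) / 2

print(lam1 * lam2, det)  # the product of the eigenvalues equals the determinant
```

Here $\det X = 3 \neq 0$, and correspondingly both eigenvalues ($3$ and $1$) are nonzero; note, however, that nonzero eigenvalues alone do not settle whether $X$ has $n$ independent eigenvectors.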