Eigendecomposition: does the eigenvectors matrix always have an inverse?


Eigenvectors are the nonzero vectors that, when multiplied by the matrix, give the same result as multiplying by a constant (the eigenvalue): $$ A \vec{v} = \lambda \vec{v} $$ If you place the eigenvectors as the columns of a matrix $Q$ and put the eigenvalues on the diagonal of a matrix $\Lambda$, you can write this as: $$ A Q = Q \Lambda \\ A = Q \Lambda Q^{-1} $$
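As a concrete sanity check, here is a small NumPy sketch; the matrix $A$ below is a hypothetical example chosen to have distinct eigenvalues, so its $Q$ happens to be invertible:

```python
import numpy as np

# Hypothetical symmetric 2x2 example; its eigenvalues (3 and 1) are distinct.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, Q = np.linalg.eig(A)   # columns of Q are the eigenvectors
Lam = np.diag(eigvals)          # eigenvalues on the diagonal of Lambda

# Reconstruct A from the decomposition A = Q Lam Q^{-1}
A_reconstructed = Q @ Lam @ np.linalg.inv(Q)
print(np.allclose(A, A_reconstructed))  # True
```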

My question: is $Q$ guaranteed to be invertible?

2 Answers

Best answer:

You ought to be a bit careful about saying "all the eigenvectors": there are infinitely many of them, since any nonzero scalar multiple of an eigenvector is again an eigenvector. But a maximal linearly independent subset of them works.

Say $A$ is $n\times n$. If $A$ has $n$ distinct eigenvalues, then the collection of $n$ eigenvectors that make up the columns of $Q$ must necessarily be linearly independent, so $Q$ is invertible.

If $A$ doesn't have $n$ distinct eigenvalues, then it may still be possible to find $n$ linearly independent eigenvectors. But there is no guarantee.

For instance, consider the matrix $$ \pmatrix{1&1\\0&1} $$ It has $1$ as its only eigenvalue, and it is impossible to find $2$ linearly independent eigenvectors: every eigenvector lies in the span of $\left(\begin{smallmatrix}1\\0\end{smallmatrix}\right)$. So in this case you cannot find such a $Q$.
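You can see the same failure numerically. A sketch with NumPy, using the matrix above: `eig` reports a repeated eigenvalue, and the eigenvector matrix it returns is singular up to floating-point noise:

```python
import numpy as np

# The defective matrix from the answer: eigenvalue 1 with geometric multiplicity 1.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, Q = np.linalg.eig(A)
print(eigvals)           # both eigenvalues are (approximately) 1
print(np.linalg.det(Q))  # (numerically) zero: the columns are parallel
```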


No, take for example the matrix $A \in \mathbb{R}^{2 \times 2}$ given by

$$ A:= \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} ~~.$$

All its eigenvectors are of the form $\begin{pmatrix} c \\ 0 \end{pmatrix}$ with $c \in \mathbb{R} \setminus \{0\}$. Write two of those as the columns of your matrix $Q$; then $Q$ will not be invertible, since its columns are linearly dependent.

What you want is true precisely when $A$ is diagonalizable, i.e., when there exists a basis of eigenvectors. Even then you cannot choose just any eigenvectors; you need linearly independent ones (which is presumably what you meant), i.e., basis vectors of the eigenspaces.
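That criterion can be sketched numerically. The helper `is_diagonalizable` below is a hypothetical name for a floating-point heuristic (not an exact symbolic test): it checks whether the eigenvector matrix returned by `numpy.linalg.eig` has full rank, i.e. whether a basis of eigenvectors was found:

```python
import numpy as np

def is_diagonalizable(A, tol=1e-10):
    """Numerical heuristic: A is diagonalizable iff a full set of
    linearly independent eigenvectors exists, i.e. the eigenvector
    matrix from eig has full rank (up to the tolerance tol)."""
    _, Q = np.linalg.eig(A)
    return np.linalg.matrix_rank(Q, tol=tol) == A.shape[0]

print(is_diagonalizable(np.array([[2.0, 0.0], [0.0, 3.0]])))  # True
print(is_diagonalizable(np.array([[1.0, 1.0], [0.0, 1.0]])))  # False (defective)
```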