Diagonalization of a Matrix and $\lambda I -A$ Rank


Let $$A = \left(\begin{matrix} -1 & 3 & -1 \\-3 & 5 & -1 \\ -3 & 3 & 1\end{matrix}\right)$$ be a matrix.

The characteristic polynomial of $A$ is:

$$(\lambda-2)^2(\lambda-1)$$

According to my professor's notes, since $\lambda=2$ is an eigenvalue of $A$ and the fact that:

$$\text{rank}(2I-A)=\text{rank}\left(\begin{matrix} 3 & -3 & 1 \\3 & -3 & 1 \\ 3 & -3 & 1\end{matrix}\right)=1$$ the matrix is diagonalizable.

However, I can't understand why.

What is the connection between the rank of $(\lambda I-A)$ and matrix diagonalization?

Thanks,

Alan

There are 4 answers below.

BEST ANSWER

An $n \times n$ matrix is diagonalizable if and only if we can find $n$ linearly independent eigenvectors. In particular, this is equivalent to saying that for each eigenvalue $\lambda$, the number of linearly independent eigenvectors equals the algebraic multiplicity of that eigenvalue (i.e. the exponent of the associated factor in the characteristic polynomial).

Because the rank of $A - 2I$ is $1$, the rank-nullity theorem tells us that the nullity is $2$, which means that there are $2$ linearly independent eigenvectors associated with $\lambda = 2$. Since $1$ is an eigenvalue, we have an eigenvector associated with $\lambda = 1$. So, in total, we have $3$ linearly independent eigenvectors. So, $A$ is diagonalizable.
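The rank and eigenvector counts above can be checked numerically; here is a minimal NumPy sketch (NumPy is an assumption of mine, not part of the original answer):

```python
import numpy as np

# The matrix from the question.
A = np.array([[-1, 3, -1],
              [-3, 5, -1],
              [-3, 3, 1]], dtype=float)

# rank(2I - A) = 1, so by rank-nullity the eigenspace of lambda = 2
# has dimension 3 - 1 = 2.
rank = np.linalg.matrix_rank(2 * np.eye(3) - A)
print(rank)  # 1

# A full set of 3 linearly independent eigenvectors exists iff the
# matrix of eigenvectors returned by eig has full rank.
eigvals, eigvecs = np.linalg.eig(A)
print(np.linalg.matrix_rank(eigvecs))  # 3, so A is diagonalizable
```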

ANSWER

Since the rank is $1$, the kernel of $2I - A$ has dimension $2$, so you can find a linearly independent family of two eigenvectors for $\lambda = 2$, which is a good start toward an eigenvector basis (of length $3$). In such a basis, $A$ would be represented by a diagonal matrix.

ANSWER

An $n \times n$ matrix is diagonalizable iff the dimensions of its eigenspaces sum to $n$. The eigenspace of the eigenvalue $\lambda=2$ is spanned by the vectors $x$ such that $Ax=2x$, i.e. $(A-2I)x=0$, so this eigenspace has the same dimension as the kernel of $A-2I$, namely $2$. Since the eigenspace of $\lambda=1$ has dimension $1$, the dimensions sum to $3$, and $A$ is diagonalizable.
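The sum-of-eigenspace-dimensions criterion can be sketched numerically, using the fact that each eigenspace dimension is the nullity $n - \text{rank}(A - \lambda I)$ (a NumPy sketch of my own, not from the original answer):

```python
import numpy as np

A = np.array([[-1, 3, -1],
              [-3, 5, -1],
              [-3, 3, 1]], dtype=float)
n = A.shape[0]

# Dimension of each eigenspace = nullity of (A - lambda*I) = n - rank.
dims = [n - np.linalg.matrix_rank(A - lam * np.eye(n)) for lam in (2, 1)]
print(dims, sum(dims))  # [2, 1] 3: the dimensions sum to n, so A is diagonalizable
```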

ANSWER

Important Observation

That two eigenvalues coincide could mean we have either one or two linearly independent eigenvectors.

For instance the matrix $${\bf A} = \left[\begin{array}{cc} 1&1\\ 0&1 \end{array}\right]$$

has only one eigenvector $[1,0]^T$ (up to scale), but the eigenvalue $1$ appears with algebraic multiplicity $2$. The space corresponding to this eigenvalue is a generalized eigenspace, which contains one true eigenvector together with vectors that are not true eigenvectors but are mapped "towards" the true one.
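This deficient case can also be checked with the rank criterion from the other answers; a minimal NumPy sketch (my own illustration, not from the original answer):

```python
import numpy as np

# The 2x2 Jordan block from this answer: eigenvalue 1 repeated twice,
# but only a one-dimensional eigenspace.
A = np.array([[1, 1],
              [0, 1]], dtype=float)

# rank(A - I) = 1, so the nullity is 2 - 1 = 1: only one
# linearly independent eigenvector, hence A is NOT diagonalizable.
print(2 - np.linalg.matrix_rank(A - np.eye(2)))  # 1

# (A - I) annihilates the true eigenvector [1, 0]^T but maps the
# generalized eigenvector [0, 1]^T onto it ("towards" the true one).
print((A - np.eye(2)) @ np.array([0.0, 1.0]))  # [1. 0.]
```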