Is this $3\times3$ matrix diagonalizable?


After browsing through similar posts, I want to check that I understand the meaning of "$n$ distinct eigenvalues" in the following theorem.

If the $n\times n$ matrix $A$ has $n$ distinct eigenvalues, then the corresponding eigenvectors are linearly independent and $A$ is diagonalizable.

$A = \begin{bmatrix} 3 & 2 & 1 \\ 0 & 0 & 2 \\ 0 & 2 & 0 \\ \end{bmatrix}$

$\det(\lambda I - A) = \begin{vmatrix} \lambda - 3 & -2 & -1 \\ 0 & \lambda & -2 \\ 0 & -2 & \lambda \\ \end{vmatrix} = (\lambda - 3)\begin{vmatrix} \lambda & -2 \\ -2 & \lambda \\ \end{vmatrix} = (\lambda - 3)(\lambda^2 - 4) = 0$
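As a numerical sanity check on the characteristic polynomial (my own addition, a pure-Python sketch rather than part of the original computation), $\det(\lambda I - A)$ should vanish exactly at the three claimed roots:

```python
# Sanity check: det(lambda*I - A) should be 0 at lambda = 3, 2, -2.
A = [[3, 2, 1],
     [0, 0, 2],
     [0, 2, 0]]

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def char_poly(lam):
    """Evaluate det(lam*I - A) for the matrix A above."""
    M = [[lam * (r == c) - A[r][c] for c in range(3)] for r in range(3)]
    return det3(M)

for lam in (3, 2, -2):
    print(lam, char_poly(lam))  # each prints 0
```

This agrees with $(\lambda - 3)(\lambda^2 - 4)$: for instance, $\lambda = 1$ gives $(-2)(-3) = 6$.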

I get $\lambda = 3, 2, -2$. I tried to find an eigenvector for $\lambda = 3$ and got a weird-looking matrix that doesn't look linearly independent. Despite this weird matrix, I rearranged it via column operations and managed to find an eigenvector, so it appears the matrix is diagonalizable? I am not sure I handled $\lambda = 3$ correctly, though. $3I - A = \begin{bmatrix} 0 & -2 & -1 \\ 0 & 3 & -2 \\ 0 & -2 & 3 \\ \end{bmatrix} \rightarrow \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \\ \end{bmatrix} \rightarrow \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \\ \end{bmatrix} \Rightarrow \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ x_3 \\ \end{bmatrix} = x_3 \begin{bmatrix} 0 \\ 0 \\ 1 \\ \end{bmatrix}$

$2I - A = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 2 & -2 \\ 0 & -2 & 2 \\ \end{bmatrix} \rightarrow \begin{bmatrix} 1 & 0 & 3 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \\ \end{bmatrix} \Rightarrow \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ \end{bmatrix} = \begin{bmatrix} -3x_3 \\ x_3 \\ x_3 \\ \end{bmatrix} = x_3 \begin{bmatrix} -3 \\ 1 \\ 1 \\ \end{bmatrix}$

$-2I - A = \begin{bmatrix} -5 & -2 & -1 \\ 0 & -2 & -2 \\ 0 & -2 & -2 \\ \end{bmatrix} \rightarrow \begin{bmatrix} 1 & 0 & -.2 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \\ \end{bmatrix} \Rightarrow \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ \end{bmatrix} = \begin{bmatrix} .2x_3 \\ -x_3 \\ x_3 \\ \end{bmatrix} = x_3 \begin{bmatrix} .2 \\ -1 \\ 1 \\ \end{bmatrix}$
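The two eigenvectors for $\lambda = 2$ and $\lambda = -2$ can be verified directly by checking $Av = \lambda v$ (a quick pure-Python check of my own; the $\lambda = -2$ vector is scaled by $5$ to keep the arithmetic in integers):

```python
# Verify A v = lambda v for the lambda = 2 and lambda = -2 eigenvectors.
A = [[3, 2, 1],
     [0, 0, 2],
     [0, 2, 0]]

def matvec(M, v):
    """Multiply a 3x3 matrix by a length-3 vector."""
    return [sum(M[r][c] * v[c] for c in range(3)) for r in range(3)]

v2 = [-3, 1, 1]    # eigenvector for lambda = 2
vm2 = [1, -5, 5]   # eigenvector for lambda = -2: the vector [.2, -1, 1] scaled by 5

print(matvec(A, v2))   # [-6, 2, 2]  ==  2 * v2
print(matvec(A, vm2))  # [-2, 10, -10]  ==  -2 * vm2
```

Both products come out as the eigenvalue times the vector, so these two are correct.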

$P = \begin{bmatrix} 0 & -3 & .2 \\ 0 & 1 & -1 \\ 1 & 1 & 1 \\ \end{bmatrix}$, $P^{-1}AP = \begin{bmatrix} 3 & 0 & 1.4\times 10^{-13} \\ -.5 & 2 & 0 \\ -2.5 & 0 & -2 \\ \end{bmatrix}$
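One way to see why $P^{-1}AP$ came out non-diagonal is to test each column of $P$ against its intended eigenvalue (my own pure-Python check, with the third column scaled by $5$ so the arithmetic stays exact):

```python
# Test each column of P against its intended eigenvalue via A v == lambda v.
A = [[3, 2, 1],
     [0, 0, 2],
     [0, 2, 0]]
P_cols = [[0, 0, 1],    # intended for lambda = 3
          [-3, 1, 1],   # intended for lambda = 2
          [1, -5, 5]]   # intended for lambda = -2 ([.2, -1, 1] scaled by 5)
eigs = [3, 2, -2]

def matvec(M, v):
    return [sum(M[r][c] * v[c] for c in range(3)) for r in range(3)]

checks = [matvec(A, v) == [lam * x for x in v] for v, lam in zip(P_cols, eigs)]
print(checks)  # [False, True, True]
```

The first column fails the check ($A\,[0,0,1]^T = [1,2,0]^T \ne [0,0,3]^T$), which is exactly why $P^{-1}AP$ has nonzero entries off the diagonal in its first column.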


Best answer:

Your error is in the first eigenvector: $$ \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix} $$ gives $x_2 = 0$ and $x_3 = 0$, with $x_1$ free; thus an eigenvector is $$ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} $$ You cannot transform it into $$ \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix} $$ which is a very different linear system.
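The corrected eigenvector is easy to confirm by direct multiplication (a quick pure-Python check, not part of the original answer):

```python
# Confirm the corrected eigenvector for lambda = 3.
A = [[3, 2, 1],
     [0, 0, 2],
     [0, 2, 0]]
v = [1, 0, 0]
Av = [sum(A[r][c] * v[c] for c in range(3)) for r in range(3)]
print(Av)  # [3, 0, 0], i.e. 3 * v, so v is an eigenvector for lambda = 3
```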

Second answer:

You have three eigenvalues; $\lambda^2 - 4$ has roots $2$ and $-2$.

Additionally, it is not the matrices that will be independent, but the eigenvectors. That "weird" matrix is going to tell you what the eigenvector associated with $\lambda=3$ is, as soon as you find its nullspace.