Diagonalization of a simple matrix. Arriving at a contradiction


Consider the matrix \begin{equation} A = \begin{pmatrix}1 & 1 & 1\\ 1 & 1 &1\\ 1 & 1 &1\end{pmatrix}.\end{equation} The rank of this matrix is $1$, which is less than $3$. So all three eigenvectors are not linearly independent. Therefore, it seems that $A$ cannot be diagonalized by a similarity transformation $S^{-1}AS$, since the matrix $S$ of eigenvectors would not be invertible.

However, the minimal polynomial of $A$ is $x(x-3)$, which has simple roots $0$ and $3$. Hence, according to the condition of diagonalizability, this matrix should be diagonalizable.
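The minimal-polynomial claim is easy to check directly: if $A(A - 3I) = 0$, then the minimal polynomial divides $x(x-3)$ and so has only simple roots. A quick sketch in plain Python (no libraries assumed):

```python
# A is the 3x3 all-ones matrix; verify A^2 = 3A, i.e. A(A - 3I) = 0,
# so the minimal polynomial x(x - 3) has the simple roots 0 and 3.
A = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]

def matmul(X, Y):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

A2 = matmul(A, A)  # every entry of A^2 is 3
print(all(A2[i][j] == 3 * A[i][j] for i in range(3) for j in range(3)))  # True
```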

Why am I arriving at such a contradiction? Where am I making a mistake?


BEST ANSWER

The eigenvalues are $\{3,0,0\}.$ So $0$ is a degenerate eigenvalue, and not a simple root of the characteristic polynomial. What is the dimension of its corresponding eigenspace? According to Mathematica, the eigenvectors corresponding to $\{3,0,0\}$ are: $$\left\{\left[\begin{matrix}1\\1\\1\end{matrix}\right],\left[\begin{matrix}-1\\0\\1\end{matrix}\right],\left[\begin{matrix}-1\\1\\0\end{matrix}\right]\right\}. $$ As these are clearly linearly independent, the geometric multiplicity of the eigenvalue $\lambda=0$ (the dimension of its eigenspace) is $2,$ making your matrix diagonalizable.
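You don't have to take Mathematica's word for it: a short pure-Python check confirms that each listed vector satisfies $Av = \lambda v$, and that the three are independent because the matrix $S$ with those vectors as columns has nonzero determinant.

```python
# Verify the eigenpairs of the 3x3 all-ones matrix and the
# independence of the three eigenvectors.
A = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
pairs = [(3, [1, 1, 1]), (0, [-1, 0, 1]), (0, [-1, 1, 0])]

def matvec(M, v):
    """Apply a 3x3 matrix to a length-3 vector."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

ok = all(matvec(A, v) == [lam * x for x in v] for lam, v in pairs)

# S has the eigenvectors as columns; det(S) != 0 means they are independent.
S = [[1, -1, -1],
     [1,  0,  1],
     [1,  1,  0]]
det = (S[0][0] * (S[1][1] * S[2][2] - S[1][2] * S[2][1])
     - S[0][1] * (S[1][0] * S[2][2] - S[1][2] * S[2][0])
     + S[0][2] * (S[1][0] * S[2][1] - S[1][1] * S[2][0]))
print(ok, det)  # prints: True -3
```

Since $\det S = -3 \neq 0$, $S$ is invertible and $S^{-1}AS = \mathrm{diag}(3, 0, 0)$.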

ANSWER

The rank of this matrix is 1 which is less than 3. So all three eigenvectors are not linearly independent.

Why not?

Existence of a full set of independent eigenvectors is completely separate from whether or not the matrix is singular (there is nothing special about zero eigenvalues, vs. non-zero eigenvalues). Consider that the zero matrix has an independent set of eigenvectors, and the classic nonsingular defective matrix $$\left[\begin{array}{cc}1 & 1\\0 & 1\end{array}\right]$$ does not.

ANSWER

You wrote "So all three eigenvectors are not linearly independent". @AdrianKeister's answer provides you three independent eigenvectors, so this conclusion is evidently incorrect.

What made you think that it was true?

Was it, possibly, knowing that if the $n$ eigenvalues of an $n \times n$ matrix are distinct, then the eigenvectors are independent? That's a statement of the form $p \implies q$, where $p$ is "the $n$ eigenvalues of an $n \times n$ matrix are distinct" and $q$ is "the eigenvectors are independent". It's not logically allowable to conclude from this that $\sim p \implies \sim q$, as you seem to have done. To see this, consider the case where $p$ is "I am John Hughes" and $q$ is "I live in Rhode Island".

The first does indeed imply the second, but is the statement

"If I am not John Hughes, then I do not live in Rhode Island" 

true? If so, our governor Gina Raimondo is in for a big surprise...

====

More seriously, your comments on some of the answers indicate that you have a great many false beliefs about eigenvalues and eigenvectors. I'd suggest a serious re-study of that material.