Conditions for a matrix to be diagonalizable


Let $M$ be a matrix with the entries $a_{1}, \dots , a_{n}$ on the secondary diagonal (the one that ranges from $m_{n1}$ to $m_{1n}$) with all other entries being $0$. Find under which conditions the matrix is similar to a diagonal matrix.


If the matrix is similar to a diagonal matrix, it follows that $M = PAP^{-1}$, where $A$ is a diagonal matrix (namely the one whose diagonal entries are the eigenvalues of $M$) and $P$ is some invertible matrix. In other words, this means that $M$ is a diagonalizable matrix.
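As an illustration of the relation $M = PAP^{-1}$ (my own numerical sketch, not part of the question), one can diagonalize a small anti-diagonal example with numpy and reconstruct it from its eigendata:

```python
import numpy as np

# 2x2 anti-diagonal example: M = [[0, a1], [a2, 0]] with a1 = 3, a2 = 2.
# Its eigenvalues are +/- sqrt(a1 * a2) = +/- sqrt(6).
M = np.array([[0.0, 3.0],
              [2.0, 0.0]])

eigvals, P = np.linalg.eig(M)   # columns of P are eigenvectors of M
A = np.diag(eigvals)            # diagonal matrix of eigenvalues

# Reconstruct M as P A P^{-1}, confirming M is diagonalizable.
assert np.allclose(M, P @ A @ np.linalg.inv(P))
```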

I tried to compute the characteristic polynomial of the matrix $M$ first, but I didn't find that helpful: we would need the determinant of a matrix with entries on both its main and secondary diagonals, which gives the determinant a complicated form (I'm sure it can be done, though).

What would be the way to determine under which conditions this matrix is diagonalizable or not? I'm currently lost at this step.


There are 2 best solutions below


Let $p(x)$ be the characteristic polynomial of $M$. Then $\operatorname{discrim}(p)$ is a product of terms of the form $a_i$, $a_ia_j-a_ka_l$ (where $i,j,k,l$ are distinct), or $a_ia_j-a_k^2$ (where $i,j,k$ are distinct). Assume that the $(a_i)_i$ are generic, that is, the $(a_i)_i$ are complex numbers that are algebraically independent over $\mathbb{Q}$. Then $\operatorname{discrim}(p)\not=0$ and $M$ has $n$ distinct eigenvalues; therefore $M$ is diagonalizable over $\mathbb{C}$. As a consequence, if we randomly choose the $(a_i)_i$ in $\mathbb{Q}$, then $M$ is "almost always" diagonalizable over $\mathbb{C}$.
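The genericity argument can be sanity-checked numerically (an illustrative sketch of my own, not part of the answer): draw the $a_i$ at random, build the anti-diagonal matrix, and verify that its $n$ eigenvalues are pairwise distinct.

```python
import numpy as np

def antidiagonal(a):
    """Matrix with a_1, ..., a_n on the secondary diagonal, zeros elsewhere."""
    n = len(a)
    M = np.zeros((n, n))
    for i, ai in enumerate(a):
        M[i, n - 1 - i] = ai
    return M

# Random entries are "generic" with probability 1 (the coincidences that
# force a repeated eigenvalue form a measure-zero set).
rng = np.random.default_rng(0)
a = rng.uniform(1.0, 2.0, size=5)
eigvals = np.linalg.eigvals(antidiagonal(a))

# All eigenvalues pairwise distinct (up to floating-point tolerance),
# hence M is diagonalizable.
pairwise_gaps = [abs(x - y) for i, x in enumerate(eigvals)
                 for y in eigvals[i + 1:]]
assert min(pairwise_gaps) > 1e-6
```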

EDIT. Answer to Guy. The discriminant of a polynomial $p$ is $0$ iff $p$ admits at least one multiple root.

The eigenvalues of $M$ are: for $n=2r+1$, $a_{r+1}$ and $(\pm\sqrt{a_ia_{n-i+1}})_{i\leq r}$; for $n=2r$, $(\pm\sqrt{a_ia_{n-i+1}})_{i\leq r}$. From these you can easily deduce the factors of $\operatorname{discrim}(p)$.
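These eigenvalue formulas can be checked numerically for a small odd case (a sketch of my own; the concrete entries below are arbitrary choices, not from the answer):

```python
import numpy as np

# n = 2r+1 = 5, so r = 2: the eigenvalues should be a_{r+1} = a_3 together
# with +/- sqrt(a_i * a_{n-i+1}) for i = 1, 2.
a = np.array([2.0, 3.0, 5.0, 7.0, 11.0])
n = len(a)
M = np.zeros((n, n))
for i in range(n):
    M[i, n - 1 - i] = a[i]

expected = sorted([a[2],                                   # a_{r+1}
                   np.sqrt(a[0] * a[4]), -np.sqrt(a[0] * a[4]),
                   np.sqrt(a[1] * a[3]), -np.sqrt(a[1] * a[3])])
computed = sorted(np.linalg.eigvals(M).real)
assert np.allclose(expected, computed)
```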


$\begin{pmatrix}0&0&0&0&a_1\\0&0&0&a_2&0\\0&0&a_3&0&0\\0&a_4&0&0&0\\a_5&0&0&0&0\end{pmatrix}$ has characteristic polynomial $\det \begin{pmatrix}-\lambda&0&0&0&a_1\\0&-\lambda&0&a_2&0\\0&0&a_3-\lambda&0&0\\0&a_4&0&-\lambda&0\\a_5&0&0&0&-\lambda\end{pmatrix}=(a_3-\lambda)(\lambda^2-a_2a_4)(\lambda^2-a_1a_5)$. This pattern persists. So the signs of the products $a_1a_n$, $a_2a_{n-1}$, etc., determine whether you have a complete factorization over ${\Bbb R}$. Then of course distinct eigenvalues will ensure independent eigenvectors to form your matrix $P$.
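The stated factorization of the $5\times 5$ characteristic polynomial can be verified symbolically (my own sketch using sympy, which the answer itself does not use):

```python
import sympy as sp

# Symbolic check that det(M - lam*I) = (a3 - lam)(lam^2 - a2*a4)(lam^2 - a1*a5)
# for the 5x5 anti-diagonal matrix from the answer.
a1, a2, a3, a4, a5, lam = sp.symbols('a1 a2 a3 a4 a5 lam')
M = sp.Matrix([[0, 0, 0, 0, a1],
               [0, 0, 0, a2, 0],
               [0, 0, a3, 0, 0],
               [0, a4, 0, 0, 0],
               [a5, 0, 0, 0, 0]])

p = (M - lam * sp.eye(5)).det()
factored = (a3 - lam) * (lam**2 - a2 * a4) * (lam**2 - a1 * a5)
assert sp.expand(p - factored) == 0
```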