Diagonalizing a matrix where the pairwise differences of eigenvalues are units


Let $R$ be a commutative ring (with $1$), and $M$ be an $n\times n$ matrix over $R$ whose characteristic polynomial is $$\det(tI - M) = (t-\alpha_1)\cdots(t-\alpha_n)$$ with $\alpha_i \in R$ for all $i$. Assume that $\alpha_i - \alpha_j \in R^\times$ whenever $i\ne j$. Is it true that there is some invertible $P \in GL_n(R)$ with $$M = P\begin{pmatrix}\alpha_1 & & & \\ & \alpha_2 & & \\ & & \ddots & \\ & & & \alpha_n\end{pmatrix}P^{-1}?$$

This is true if $R$ is a field. The proof here shows that we must have $R^n = \oplus_{i=1}^n \ker(M - \alpha_i I_n)$, at least in the case when $R$ is a local ring (but also it seems to work for general $R$). But would it follow that we can write $M$ in the form above? It is not clear to me that each $\ker(M - \alpha_i I_n)$ is free (if $R$ is not local), so I am not sure if we can necessarily find a basis of $R^n$ consisting of eigenvectors of $M$.

If the above is false, might there be some conditions on $R$ that would make it true, e.g. if we assume $R$ is an algebra over a field?
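For a concrete instance of the field case mentioned above (my own illustrative example, not from the question): over $\mathbb{Q}$, distinct eigenvalues have unit differences, and SymPy finds a basis of eigenvectors directly.

```python
from sympy import Matrix

# A rational matrix with distinct eigenvalues 1 and 2;
# their difference is a unit since Q is a field.
M = Matrix([[1, 1], [0, 2]])

# diagonalize() returns P and diagonal D with M = P * D * P^{-1};
# the columns of P form a basis of eigenvectors.
P, D = M.diagonalize()
print(P * D * P**-1 == M)  # True
```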



BEST ANSWER

Let $A_i$ be the product of all $M-\alpha_j I_n$ over $j\neq i$, and set $K_i = A_i(R^n) \subset R^n$. Then $M$ acts as multiplication by $\alpha_i$ on $K_i$, since by Cayley–Hamilton $(M-\alpha_i I_n)A_i = \chi_M(M) = 0$. Next show that $\sum_i K_i=R^n$, using your condition that the $\alpha_i-\alpha_j$ are units, and that $K_i\cap \sum_{j\neq i} K_j=0$, so $R^n=\oplus_i K_i$. This gives you more or less what you want, except that the $K_i$ are projective modules of rank one but may not be free, so I am not sure you will get your diagonalization globally.
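A sketch of the spanning step (my own filling-in of the interpolation identity this answer alludes to; all under the hypothesis that the $\alpha_i - \alpha_j$ are units):

```latex
% The scalars c_i = \prod_{j \ne i} (\alpha_i - \alpha_j) are units, so the
% Lagrange idempotents e_i = c_i^{-1} A_i make sense over R.  The identity
\[
  \sum_{i=1}^{n} \prod_{j \ne i} \frac{t - \alpha_j}{\alpha_i - \alpha_j} \;=\; 1
\]
% holds in R[t]: the difference of the two sides has degree < n and vanishes
% at t = \alpha_1, \dots, \alpha_n, and since the Vandermonde determinant
% \prod_{i<j} (\alpha_j - \alpha_i) is a unit, such a polynomial is zero.
% Substituting t = M gives
\[
  I_n \;=\; \sum_{i=1}^{n} e_i(M), \qquad
  e_i(M) \;=\; \prod_{j \ne i} \frac{M - \alpha_j I_n}{\alpha_i - \alpha_j},
\]
% so every v \in R^n decomposes as v = \sum_i e_i(M) v with
% e_i(M) v \in K_i = A_i(R^n); hence \sum_i K_i = R^n.  Moreover
% (M - \alpha_i I_n)\, e_i(M) = c_i^{-1} \chi_M(M) = 0 by Cayley--Hamilton,
% confirming that M acts as \alpha_i on K_i.
```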

ANSWER

Take $R = \mathbb{Z}[\sqrt{-5}]$ and $M = \begin{bmatrix}3 & 1+\sqrt{-5} \\ -1+\sqrt{-5} & -2\end{bmatrix}$. This has characteristic polynomial $t(t-1)$, and $0 - 1 = -1$ is a unit, so the hypotheses of the question are satisfied.
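As a quick sanity check, here is a SymPy sketch of the characteristic polynomial computation, modeling $\sqrt{-5}$ as the complex number $\sqrt{5}\,i$:

```python
from sympy import Matrix, eye, sqrt, symbols, expand

t = symbols('t')
s = sqrt(-5)  # the element sqrt(-5) of Z[sqrt(-5)], modeled inside C
M = Matrix([[3, 1 + s], [-1 + s, -2]])

# Characteristic polynomial det(tI - M); it should come out as t(t - 1).
charpoly = expand((t * eye(2) - M).det())
print(charpoly)  # t**2 - t
```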

I claim that we cannot diagonalize $M$ over $R$. If we could, then the $0$-eigenspace and the $1$-eigenspace would both be free of rank $1$ over $R$. Say $\begin{bmatrix} x \\ y\end{bmatrix} \in R^2$ generates the $0$-eigenspace. The $0$-eigenspace contains both $\begin{bmatrix} 1+\sqrt{-5} \\ -3\end{bmatrix}$ and $\begin{bmatrix} 2 \\ -1+\sqrt{-5}\end{bmatrix}$, so there must be some $r,s \in R$ such that $$\begin{bmatrix} 1+\sqrt{-5} \\ -3\end{bmatrix} = r\begin{bmatrix} x \\ y\end{bmatrix}, \quad \begin{bmatrix} 2 \\ -1+\sqrt{-5}\end{bmatrix} = s\begin{bmatrix} x \\ y\end{bmatrix}.$$ In particular, $ry = -3$ and $sx = 2$, so $y$ divides $3$ and $x$ divides $2$ in $R$. Since $2$ and $3$ are irreducible in $R$ and the units of $R$ are $\pm 1$, this forces $x \in \{\pm 1, \pm 2\}$ and $y \in \{\pm 1, \pm 3\}$; in particular $x, y \in \mathbb{Z}$. But $$\begin{bmatrix} 0 \\ 0\end{bmatrix} = \begin{bmatrix}3 & 1+\sqrt{-5} \\ -1+\sqrt{-5} & -2\end{bmatrix} \begin{bmatrix} x \\ y\end{bmatrix} = \begin{bmatrix} 3x + (1+\sqrt{-5})y \\ (-1+\sqrt{-5})x - 2y\end{bmatrix}$$ has no integer solution $(x,y)$ other than $(0,0)$: since $\sqrt{-5}$ is irrational, the first entry forces $y = 0$ and then $x = 0$. Contradiction.
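The membership claims above can be checked mechanically; a small SymPy sketch (again modeling $\sqrt{-5}$ as a complex number):

```python
from sympy import Matrix, sqrt, expand

s = sqrt(-5)  # sqrt(-5), modeled as the complex number sqrt(5)*i
M = Matrix([[3, 1 + s], [-1 + s, -2]])

# Both claimed vectors are killed by M, so they lie in the 0-eigenspace.
v1 = Matrix([1 + s, -3])
v2 = Matrix([2, -1 + s])
print((M * v1).applyfunc(expand))  # Matrix([[0], [0]])
print((M * v2).applyfunc(expand))  # Matrix([[0], [0]])
```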