Invertible matrix over a local ring


In this paper by Kaplansky on his theorem that projective modules over local rings are free, he states:

Hence the matrix $(c_{ij})$ is non-singular; for it has units down the main diagonal and non-units elsewhere, and over a local ring this suffices to make a matrix non-singular.

Why are matrices with units on the diagonal and non-units off the diagonal invertible in a local ring?

The last paragraph of this blog post on the paper discusses one approach but I'm still missing something. It proves that such a matrix is the sum of an invertible matrix and a matrix in the Jacobson radical, but why is that sum invertible? I think it has something to do with "$x$ is in the Jacobson radical of $R$ iff $1-xr$ is a unit for all $r \in R$."


There are 2 best solutions below


I believe I have solved my own question. The answer follows the blog post linked in the question.

Write $(c_{ij}) = D + E$, where $D$ is the diagonal matrix of units and $E$ is the matrix of off-diagonal non-units, with zeros on the diagonal. $D$ is clearly invertible. Since $R$ is local, its non-units form the maximal ideal, which equals the Jacobson radical $J(R)$; hence every entry of $E$ lies in $J(R)$, and so $E \in M_n(J(R)) = J(M_n(R))$. Since $J(M_n(R))$ is an ideal, $D^{-1}E$ also lies in it, so $1 + D^{-1}E$ is a unit by "$x$ is in the Jacobson radical of $R$ iff $1 - xr$ is a unit for all $r \in R$," applied with $x = D^{-1}E$ and $r = -1$. Then $D(1 + D^{-1}E) = D + E$ is a product of units, and thus a unit.
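As a concrete sanity check (my own illustration, not part of the original argument), here is a computation over the commutative local ring $\mathbb{Z}/8\mathbb{Z}$, whose units are exactly the odd residues and whose maximal ideal is $(2)$:

```python
# Sanity check over Z/8Z, a commutative local ring: units = odd residues,
# maximal ideal (= Jacobson radical) = the even residues.
from itertools import permutations

MOD = 8

def det(M):
    """Integer determinant via the Leibniz formula (fine for tiny matrices)."""
    n = len(M)
    total = 0
    for p in permutations(range(n)):
        sgn = 1
        for a in range(n):
            for b in range(a + 1, n):
                if p[a] > p[b]:
                    sgn = -sgn
        term = sgn
        for r in range(n):
            term *= M[r][p[r]]
        total += term
    return total

# Units (odd) on the diagonal, non-units (even) elsewhere: C = D + E.
C = [[1, 2, 4],
     [6, 3, 2],
     [4, 2, 5]]

# Over a commutative ring, C is invertible iff det(C) is a unit.
print(det(C) % MOD)  # 7, an odd residue, hence a unit in Z/8Z
```

The determinant criterion is of course special to the commutative case; the matrix here is just one example with the required unit/non-unit pattern.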


It is important to note that we're working with not-necessarily-commutative rings here.

We can go from the initial matrix to the identity using a sequence of row operations. Since each row operation is left multiplication by an invertible matrix, it follows that the initial matrix is invertible.

Recall a few basic facts that hold in any ring $R$. Write $R^*$ for the group of units of $R$. If $x \in R$ and $u \in R^*$, then $x \in R^*$ iff $xu \in R^*$; a symmetrical argument shows $x \in R^*$ iff $ux \in R^*$. The special case $u = -1$ gives us that $x$ is a unit iff $-x$ is a unit.

First, let us recall that $R$ is a local ring iff $0 \neq 1$ and, for all $x$, either $x$ or $1 - x$ is a unit. Equivalently, for all $n \in \mathbb{N}$ and all $x_1, \ldots, x_n \in R$, if $\sum\limits_{i = 1}^n x_i$ is a unit, then there is some $i$ for which $x_i$ is a unit. By contraposition, if all $x_i$ are non-units, then $\sum\limits_{i = 1}^n x_i$ is not a unit.
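These characterizations can be checked exhaustively in a small example. The following sketch (my own illustration, using $\mathbb{Z}/8\mathbb{Z}$ as the local ring) verifies both:

```python
# Z/8Z is a local ring: its non-units (the even residues) form an ideal.
MOD = 8
units = {x for x in range(MOD) if any((x * y) % MOD == 1 for y in range(MOD))}
nonunits = set(range(MOD)) - units

# For every x, either x or 1 - x is a unit.
assert all(x in units or (1 - x) % MOD in units for x in range(MOD))

# A sum of non-units is a non-unit (non-units lie in the maximal ideal (2)).
assert all((a + b) % MOD in nonunits for a in nonunits for b in nonunits)

print(sorted(units))  # [1, 3, 5, 7]
```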

It follows that for all $x, y$ where $y$ is a non-unit, $x$ is a unit iff $x + y$ is a unit. For if $x$ is a unit, then either $x + y$ or $-y$ is a unit since $x = (x + y) - y$, and we already know $-y$ isn't a unit. And if $x + y$ is a unit, then either $x$ or $y$ is a unit, and we already know $y$ isn't a unit.

Now suppose we have some $x, y \in R$ such that $xy = 1$. I claim that $x$ is a unit. Note that this is equivalent to saying $y$ is a unit. To prove this, we use the local ring property that either $-x$ or $1 + x$ is a unit, and that either $y$ or $1 - y$ is a unit. If $-x$ is a unit, then so is $x$. We've already remarked that if $y$ is a unit, so is $x$. Suppose that both $1 + x$ and $1 - y$ are units. Then $(1 + x) (1 - y) = 1 + x - y - 1 = x - y$ is a unit, so either $x$ is a unit, or $-y$ is a unit. In the latter case, $y$ is also a unit, and thus so is $x$.

It therefore follows that for all $x, y \in R$, $xy \in R^*$ iff $x \in R^*$ and $y \in R^*$. So the product of two non-units will result in a non-unit.

With these facts in mind, we are now ready to present the algorithm. The algorithm provides a specific sequence of row operations which reduces our original matrix to the identity.

For each $i$ from 1 to $n$, we do the following:

  1. Left-multiply row $i$ by $c_{ii}^{-1}$.
  2. For each $j \neq i$ from $1$ to $n$, subtract $c_{ji}$ times row $i$ from row $j$.

The end result will be the identity matrix.
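Here is a runnable sketch of this procedure (my own illustration; the function name and sample matrix are made up). It works over the commutative local ring $\mathbb{Z}/8\mathbb{Z}$ so that `pow(c, -1, 8)` computes the inverse of a unit, and it mirrors every row operation on an identity matrix to accumulate the inverse:

```python
# A sketch of the two-step row reduction over Z/8Z (a commutative local
# ring, so pow(c, -1, 8) inverts units). Illustrative, not from the answer.
MOD = 8

def invert_local(C, mod=MOD):
    """Reduce C to the identity with the two row operations above,
    mirroring each operation on an identity matrix to build C^{-1}."""
    n = len(C)
    C = [row[:] for row in C]
    inv = [[int(r == c) for c in range(n)] for r in range(n)]
    for i in range(n):
        # Step 1: multiply row i by c_ii^{-1} (c_ii stays a unit throughout).
        u = pow(C[i][i], -1, mod)
        C[i] = [(u * x) % mod for x in C[i]]
        inv[i] = [(u * x) % mod for x in inv[i]]
        # Step 2: subtract c_ji times row i from every other row j,
        # clearing the rest of column i.
        for j in range(n):
            if j != i:
                f = C[j][i]
                C[j] = [(x - f * y) % mod for x, y in zip(C[j], C[i])]
                inv[j] = [(x - f * y) % mod for x, y in zip(inv[j], inv[i])]
    assert C == [[int(r == c) for c in range(n)] for r in range(n)]
    return inv

C = [[1, 2, 4],
     [6, 3, 2],
     [4, 2, 5]]   # units (odd) on the diagonal, non-units (even) elsewhere
A = invert_local(C)  # A satisfies A @ C == I (mod 8)
```

In a noncommutative ring the same operations apply, but the scaling in step 1 must multiply on the left, matching the analysis below.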

To demonstrate the correctness of this procedure, we verify that each row operation preserves the property that on-diagonal elements are units and off-diagonal elements aren’t.

The row operation in step 1 only modifies row $i$. Since we multiply on the left, we have $(c_{ij})_{new} = (c_{ii})_{old}^{-1}(c_{ij})_{old}$, so we see that $(c_{ij})_{new}$ is a unit iff $(c_{ij})_{old}$ is. So the invariant is preserved. Note that step 1 also results in $(c_{ii})_{new} = (c_{ii})_{old}^{-1}(c_{ii})_{old} = 1$.

The row operation in step 2 only modifies row $j$. We have $(c_{jk})_{new} = (c_{jk})_{old} - (c_{ji})_{old} (c_{ik})_{old}$, and we consider two cases.

The first case is $k = i$. Here $(c_{ji})_{new} = (c_{ji})_{old} - (c_{ji})_{old}(c_{ii})_{old} = (c_{ji})_{old} (1 - (c_{ii})_{old})$. By this stage, step 1 has made $c_{ii} = 1$, so this quantity is $0$. And since $j \neq i$, we have a non-unit in an off-diagonal place, as we should.

For any other $k$, we have $(c_{ik})_{old}$ a non-unit (since $k \neq i$), and $(c_{ji})_{old}$ is a non-unit since $j \neq i$; therefore $-(c_{ji})_{old} (c_{ik})_{old}$ is also a non-unit, as the product of non-units. Thus, $(c_{jk})_{old} - (c_{ji})_{old} (c_{ik})_{old}$ is a unit if and only if $(c_{jk})_{old}$ is a unit, which occurs if and only if $j = k$. So we have units on the diagonal and non-units elsewhere.

Indeed, after we have finished the iteration with $i = m$, we have that $c_{jk} = \delta_{jk}$ for all $1 \leq j \leq n$ and $1 \leq k \leq m$, where $\delta$ is the Kronecker delta. This is easy to verify: step 1 sets $c_{ii}$ to $1$, while step 2 kills off all the other entries in column $i$. Thus, after completing all the iterations, we have $c_{jk} = \delta_{jk}$, so $c$ is the identity matrix.