Factoring an integer matrix into unimodular and upper triangular matrices


I'm stuck at the following problem. I've read this somewhere, but the author did not provide a proof, probably assuming that this is "clear". Let $A \in \mathbb{Z}^{n \times n}$. Then, there exist a unique $B \in \mathbb{Z}^{n \times n}$ and a unique $U \in \mathbb{Z}^{n \times n}$ such that

  • $\det B=1$
  • $U$ is upper triangular, with $0 \leq u_{ij}<u_{jj}$ if $i<j$. In other words, $U$ has non-negative entries, its diagonal entries are positive, and every entry above the diagonal is strictly smaller than the diagonal entry of its column.
  • $ A = BU.$
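
A small example of such a factorization (my own illustration, not from the source I read) for $n=2$:
$$
A = \begin{pmatrix} 2 & 1 \\ 1 & 3 \end{pmatrix}
  = \begin{pmatrix} 2 & -1 \\ 1 & 0 \end{pmatrix}
    \begin{pmatrix} 1 & 3 \\ 0 & 5 \end{pmatrix}
  = BU,
$$
where indeed $\det B = 1$ and the off-diagonal entry of $U$ satisfies $0 \leq 3 < 5 = u_{22}$.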

I believe this problem is related to lattice reduction theory. If someone knows an argument for why this is true, or can point me to a source where it is proved, I would be very grateful.



You can apply a strategy similar to the Euclidean algorithm. Take the first column of $A$ and select the entry of smallest non-zero absolute value. Reduce all other rows by integer multiples of that row. Select the new smallest non-zero entry and repeat until all but one entry in the first column are zero. Swap that row into the first position, then continue with the submatrix below and to the right of the $(1,1)$ entry, i.e. starting at the $(2,2)$ element. Finally, to get $0 \leq u_{ij} < u_{jj}$, make each diagonal entry positive (negating a row if necessary) and reduce the entries above each diagonal element modulo that element.

The row reductions, sign changes and permutations are elementary integer unimodular matrices, so their product, and hence its inverse $B$, is also integer and unimodular.
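A sketch of this procedure in Python (the function name `hnf_decompose` and its internals are my own, not from the answer): it performs the Euclidean-style row reduction on a copy of $A$ to build $U$, while applying the inverse of each elementary operation to the columns of an identity matrix, so that $B U$ stays equal to $A$ throughout. It assumes $A$ is nonsingular, and produces $B$ with $\det B = \pm 1$.

```python
def hnf_decompose(A):
    """Factor a nonsingular integer matrix A as A = B @ U, with B integer
    unimodular (det = +-1) and U upper triangular, positive on the diagonal,
    with 0 <= U[i][j] < U[j][j] for i < j.  Pure-Python sketch of the
    Euclidean-style row reduction described above."""
    n = len(A)
    U = [row[:] for row in A]                                # becomes upper triangular
    B = [[int(i == j) for j in range(n)] for i in range(n)]  # accumulates B

    def row_op(i, j, q):
        # U <- E U where E subtracts q * (row j) from (row i); then
        # B <- B E^{-1}, i.e. (col j of B) += q * (col i of B),
        # so the product B @ U is invariant.
        for k in range(n):
            U[i][k] -= q * U[j][k]
        for k in range(n):
            B[k][j] += q * B[k][i]

    for j in range(n):
        # Euclidean algorithm on column j, restricted to rows j..n-1.
        while True:
            nz = [i for i in range(j, n) if U[i][j] != 0]
            piv = min(nz, key=lambda i: abs(U[i][j]))
            others = [i for i in nz if i != piv]
            if not others:
                break
            for i in others:
                row_op(i, piv, U[i][j] // U[piv][j])
        if piv != j:  # swap the surviving row into position j (columns of B swap)
            U[j], U[piv] = U[piv], U[j]
            for k in range(n):
                B[k][j], B[k][piv] = B[k][piv], B[k][j]
        if U[j][j] < 0:  # make the diagonal entry positive
            for k in range(n):
                U[j][k] = -U[j][k]
                B[k][j] = -B[k][j]
        for i in range(j):  # reduce entries above the diagonal mod U[j][j]
            row_op(i, j, U[i][j] // U[j][j])
    return B, U
```

For instance, `hnf_decompose([[2, 1], [1, 3]])` yields `B = [[2, -1], [1, 0]]` and `U = [[1, 3], [0, 5]]`, and one checks that $BU = A$ and $\det B = 1$.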


See also the Hermite normal form (the matrix $U$ above is exactly the Hermite normal form of $A$) and the Smith normal form.