Inverse of an Integer Matrix


I found a problem on the Open Problem Garden asking for conditions under which a rectangular, full-rank integer matrix $A$ has a right inverse (given by $A^T (AA^T)^{-1}$) that is also an integer matrix. The rectangular matrix is constructed in the following way:

Let D be a square diagonal matrix (size $m \times m$) with integer entries $\geq 2$ along the main diagonal (this ensures full rank, and hence the existence of a right inverse).

Let X be an integer matrix (size $m \times n$) with $n\geq m$.

Now, concatenate the matrices to form the rectangular matrix $M = [D\ X]$ of size $m \times (m+n)$. I am interested in the right inverse of this matrix $M$.

I have written code to test some matrices, and I have yet to find even one integer entry in a right inverse, let alone an entire integer matrix. I have also done some algebraic analysis of a general $2\times 4$ matrix, and intuitively it looks as though some entries can never be a nonzero integer, but this is difficult to prove. Any advice on how to proceed, or any insight, would be appreciated.
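For what it's worth, an experiment of this kind can be sketched in Python using exact rational arithmetic, so that integrality can be tested without floating-point error. The function names and search ranges below are my own illustrative choices, not from the original post:

```python
from fractions import Fraction
from itertools import product

def pseudo_right_inverse_2rows(M):
    """Exact right inverse M^T (M M^T)^{-1} for a full-rank integer matrix with 2 rows."""
    r0, r1 = M
    a = sum(x * x for x in r0)               # (M M^T)_{11}
    b = sum(x * y for x, y in zip(r0, r1))   # (M M^T)_{12} = (M M^T)_{21}
    d = sum(y * y for y in r1)               # (M M^T)_{22}
    det = a * d - b * b                      # > 0 when the rows are independent
    # (M M^T)^{-1} via the 2x2 adjugate formula
    g = [[Fraction(d, det), Fraction(-b, det)],
         [Fraction(-b, det), Fraction(a, det)]]
    # M^T (M M^T)^{-1}: one row per column of M
    return [[r0[i] * g[0][j] + r1[i] * g[1][j] for j in range(2)]
            for i in range(len(r0))]

def is_integer_matrix(R):
    return all(entry.denominator == 1 for row in R for entry in row)

# Brute-force search over small matrices M = [D X] (diagonal entries >= 2);
# collect any integer right inverses found -- the question reports finding none.
hits = []
for d1, d2 in product(range(2, 4), repeat=2):
    for x in product(range(-2, 3), repeat=4):
        M = [[d1, 0, x[0], x[1]], [0, d2, x[2], x[3]]]
        if is_integer_matrix(pseudo_right_inverse_2rows(M)):
            hits.append(M)
```

Since $MR = MM^T(MM^T)^{-1} = I$ holds exactly, the helper can be sanity-checked by multiplying back.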

Edits: Clarified the characterization of the matrices in question. Renamed matrices for consistency.


Here's an example of a square $2\times 2$ matrix with each element $\geq 2$ whose inverse is integral:

$$\begin{bmatrix}3 & 2\\4 & 3\end{bmatrix}^{-1} = \begin{bmatrix}3 & -2\\-4 & 3\end{bmatrix}$$
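A direct multiplication confirms this:

$$\begin{bmatrix}3 & 2\\4 & 3\end{bmatrix}\begin{bmatrix}3 & -2\\-4 & 3\end{bmatrix} = \begin{bmatrix}9-8 & -6+6\\12-12 & -8+9\end{bmatrix} = \begin{bmatrix}1 & 0\\0 & 1\end{bmatrix}$$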


You can find a necessary condition for the inverse of a square integer matrix to also be an integer matrix as follows:

If $A$ is a square integer matrix, then its determinant $\det (A)$ is an integer. Similarly, if $A^{-1}$ exists and is an integer matrix, then $\det (A^{-1})$ is an integer too. But $\det(A)\det(A^{-1}) = \det(AA^{-1}) = \det(I) = 1$. So you are looking for two integers whose product is $1$, which forces $\det(A) = \det(A^{-1}) = \pm 1$.
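In code, the unimodular case is easy to illustrate with exact arithmetic (a small sketch; the helper is mine and uses the adjugate formula for $2\times 2$ matrices):

```python
from fractions import Fraction

def inverse_2x2(A):
    """Exact inverse of an integer 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "matrix is singular"
    return [[Fraction(d, det), Fraction(-b, det)],
            [Fraction(-c, det), Fraction(a, det)]]

A = [[3, 2], [4, 3]]   # det = 1 (unimodular): the inverse is integral
B = [[4, 2], [3, 2]]   # det = 2: the inverse picks up denominators
```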


If $\begin{bmatrix} R\\ S\end{bmatrix} $ is a right inverse for $M$, we have $$ I=\begin{bmatrix} D&X\end{bmatrix} \begin{bmatrix} R\\ S\end{bmatrix} =DR+XS. $$ Comparing diagonal entries, we have for all $k$ $$\tag1 D_{kk}R_{kk}+\sum_h X_{kh}S_{hk}=1 $$ We can see the above as Bézout's Identity, so a necessary condition for all entries of $R$ and $S$ to be integers is that for all $k$ the tuples $D_{kk}, X_{k1},\ldots,X_{kn}$ (i.e., the nonzero elements in each row of $M$) are coprime.
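This necessary condition is easy to test mechanically (a sketch; the function name is mine):

```python
from functools import reduce
from math import gcd

def rows_coprime(d_diag, X):
    """Necessary condition: for each row k of M = [D X],
    gcd(D_kk, X_k1, ..., X_kn) must equal 1."""
    return all(reduce(gcd, row, dk) == 1 for dk, row in zip(d_diag, X))
```

For instance, `rows_coprime([2, 3], [[4, 5], [6, 7]])` holds, while replacing the second row of $X$ by $(6, 9)$ breaks it, since $\gcd(3, 6, 9) = 3$.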

The remaining equalities are, for $k\ne j$, $$\tag2 D_{kk}R_{kj}+\sum_hX_{kh}S_{hj}=0. $$ If each $S_{hj}$ is a multiple of $D_{kk}$, we can solve for $R_{kj}$; so we get a sufficient condition if $S_{hj}$ is a multiple of $\prod_{r\ne j}D_{rr}$. For $(1)$ to remain solvable under this constraint, $D_{kk}$ must be coprime to $\prod_{r\ne k}D_{rr}$ for each $k$, i.e. $D_{11},\ldots,D_{mm}$ must be pairwise coprime. Thus

A sufficient condition for $M$ to have a right inverse with integer entries is that

  • for each $k$, the numbers $D_{kk}, X_{k1},\ldots,X_{kn}$ have greatest common divisor $1$;

  • the numbers $D_{11},\ldots,D_{mm}$ are pairwise coprime.
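The whole construction can be turned into a short algorithm: solve the Bézout identity $(1)$ in each row with the extended Euclidean algorithm, forcing the entries of $S$ in column $k$ to be multiples of $\prod_{r\ne k}D_{rr}$, then fill in the off-diagonal entries of $R$ from $(2)$. A Python sketch under the two conditions above (all function names are mine):

```python
from math import prod  # Python 3.8+

def ext_gcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y == g == gcd(a, b)."""
    if b == 0:
        return (abs(a), 1 if a >= 0 else -1, 0)
    g, x, y = ext_gcd(b, a % b)
    return (g, y, x - (a // b) * y)

def bezout(nums):
    """Coefficients c with sum(c_i * nums_i) == gcd(nums)."""
    g, coeffs = nums[0], [1]
    for a in nums[1:]:
        g2, x, y = ext_gcd(g, a)
        coeffs = [c * x for c in coeffs] + [y]
        g = g2
    return g, coeffs

def integer_right_inverse(d, X):
    """Integer right inverse [R; S] of M = [diag(d) X], assuming the two
    coprimality conditions above hold."""
    m, n = len(d), len(X[0])
    R = [[0] * m for _ in range(m)]
    S = [[0] * m for _ in range(n)]
    for k in range(m):
        c = prod(d) // d[k]                      # prod_{r != k} D_rr
        g, coef = bezout([d[k]] + [c * X[k][h] for h in range(n)])
        assert g == 1, "coprimality conditions violated"
        R[k][k] = coef[0]                        # diagonal entry from (1)
        for h in range(n):
            S[h][k] = c * coef[1 + h]            # multiple of prod_{r != k} D_rr
    for k in range(m):                           # off-diagonal entries from (2)
        for j in range(m):
            if j != k:
                xs = sum(X[k][h] * S[h][j] for h in range(n))
                R[k][j] = -xs // d[k]            # exact division by construction
    return R, S
```

The Bézout coefficients are not unique, so this may return a different right inverse than the one worked out by hand below; either way $M\,[R;S]=I$ holds by construction.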


To see an example of how to use this, take $$ D=\begin{bmatrix} 2&0\\0&3\end{bmatrix},\ \ \ X=\begin{bmatrix}4&5\\6&7 \end{bmatrix} . $$

We want $S_{hj}=\left(\prod_{r\ne j}D_{rr}\right)\,S_{hj}'$. So the two equalities in $(1)$ become $$ \begin{cases}1=2R_{11}+X_{11}\times D_{22}\times S_{11}'+X_{12}\times D_{22}\times S_{21}'\\ 1=3R_{22}+X_{21}\times D_{11}\times S_{12}'+X_{22}\times D_{11}\times S_{22}' \end{cases} $$ That is, we need $$ \begin{cases} 1=2R_{11}+4\times 3\times S_{11}'+5\times 3\times S_{21}'\\ 1=3R_{22}+6\times 2\times S_{12}'+ 7\times 2\times S_{22}' \end{cases} $$ For instance (among infinitely many possible choices) $$ \begin{cases} 1=2(-13)+4\times 3\times 1+5\times 3\times 1\\ 1=3\times 9+6\times 2\times (-1)+7\times 2\times (-1) \end{cases} $$ This gives us $R_{11}=-13$, $R_{22}=9$, and $$ S_{11}=3,\ S_{21}=3,\ S_{12}=-2,\ S_{22}=-2. $$ Now we can write, from $(2)$, $$ R_{12}=-(X_{11}S_{12}+X_{12}S_{22})/D_{11}=-(4\times (-2)+5\times (-2))/2=9, $$ $$ R_{21}=-(X_{21}S_{11}+X_{22}S_{21})/D_{22}=-(6\times 3+7\times 3)/3=-13. $$ So $R=\begin{bmatrix} -13&9\\ -13&9\end{bmatrix}$. Now you can check directly that $$ \begin{bmatrix} 2&0&4&5\\ 0&3&6&7\end{bmatrix} \begin{bmatrix} -13&9\\-13&9\\ 3&-2\\3&-2\end{bmatrix} =\begin{bmatrix} 1&0\\0&1\end{bmatrix}. $$


In general we may use that $(2)$ can be written as $$\tag3R_{kj}=-\frac{(XS)_{kj}}{D_{kk}},\ \ \ \ \ k\ne j.$$

For another example, if $D=\begin{bmatrix} 2&0&0\\0&3&0\\0&0&5\end{bmatrix} $ and $X=\begin{bmatrix} 5&7\\4&5\\6&7\end{bmatrix}$, the algorithm, with the choices $$ \begin{cases} 1=2(-127) + 5\times 30+ 7\times 15 \\ 1=3(-43)+ 4\times 20 + 5\times 10\\ 1=5\times 23+6\times (-12)+7\times (-6) \end{cases} $$ gives $R_{11}=-127$, $R_{22}=-43$, $R_{33}=23$, $S=\begin{bmatrix} 30&20&-12\\ 15&10&-6 \end{bmatrix}$, and then, using $(3)$, $$ \begin{bmatrix} 2&0&0&5&7\\0&3&0&4&5\\0&0&5&6&7\end{bmatrix} \begin{bmatrix} -127&-85&51\\ -65&-43&26\\ -57&-38&23\\ 30&20&-12\\ 15&10&-6\end{bmatrix} =\begin{bmatrix} 1&0&0\\0&1&0\\0&0&1\end{bmatrix}. $$
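The final product can also be double-checked with a few lines of integer arithmetic:

```python
# The 3x5 matrix M = [D X] and the claimed right inverse [R; S] from above
M = [[2, 0, 0, 5, 7],
     [0, 3, 0, 4, 5],
     [0, 0, 5, 6, 7]]
RS = [[-127, -85, 51],
      [-65, -43, 26],
      [-57, -38, 23],
      [30, 20, -12],
      [15, 10, -6]]

# Plain integer matrix product M @ RS; it should be the 3x3 identity
product = [[sum(M[i][k] * RS[k][j] for k in range(5)) for j in range(3)]
           for i in range(3)]
```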


Finally, note that $(1)$ leaves infinitely many choices for the matrix $S$, so the algorithm produces infinitely many different right inverses.