How to extend a 2x2 matrix with determinant 1 to a 3x3 matrix with determinant 1


Let's consider a 2x2 integer matrix with determinant equal to 1: $$\left( \begin{array}{cc} a & b \\ c & d \\ \end{array} \right)$$

I am working on the following:

How can this be extended to a 3x3 matrix so that the result again has determinant 1: $$\left( \begin{array}{ccc} a & b & e \\ c & d & f \\ i & h & g \\ \end{array} \right)$$

Also, are there any $a,b,c,d$ for which this extension is unique? I have no idea how to start solving this. I have found the following on the web, but am not sure how to use it:

Integer matrices with determinant equal to $1$

https://mathoverflow.net/questions/24131/is-the-semigroup-of-nonnegative-integer-matrices-with-determinant-1-finitely-gen

EDITED:

Actually, I am looking for a general algorithm to construct all such 3x3 matrices from a given 2x2 matrix with determinant 1.
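One way to sketch such an algorithm (my own derivation, not from the question): expanding the determinant of the 3x3 matrix along its bottom row shows that $e, f, i, h$ can be chosen freely, after which $g$ is forced, so every extension arises this way. A stdlib-only Python sketch, with hypothetical helper names:

```python
# Sketch (my derivation): extend [[a, b], [c, d]] with a*d - b*c = 1
# to [[a, b, e], [c, d, f], [i, h, g]] with determinant 1.
# Cofactor expansion along the bottom row gives
#   det = i*(b*f - e*d) - h*(a*f - e*c) + g*(a*d - b*c),
# so with a*d - b*c = 1 the entries e, f, i, h are free and g is forced.

def extend(a, b, c, d, e, f, i, h):
    assert a * d - b * c == 1
    g = 1 - i * (b * f - e * d) + h * (a * f - e * c)
    return [[a, b, e], [c, d, f], [i, h, g]]

def det3(m):
    # determinant of a 3x3 matrix by cofactor expansion along the top row
    (a, b, e), (c, d, f), (i, h, g) = m
    return a * (d * g - f * h) - b * (c * g - f * i) + e * (c * h - d * i)
```

For example, `extend(1, 1, -1, 0, 1, 1, -1, 0)` forces $g = 2$ and reproduces the first sample matrix in EDITED 2 below.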

EDITED 2:

Some samples of such matrices:

$$\left( \begin{array}{ccc} 1 & 1 & 1 \\ -1 & 0 & 1 \\ -1 & 0 & 2 \\ \end{array} \right) $$

$$\left( \begin{array}{ccc} 1 & 1 & 1 \\ 1 & 2 & 3 \\ 2 & 5 & 9 \\ \end{array} \right)$$

$$\left( \begin{array}{ccc} 1 & 1 & 1 \\ -6 & -5 & -4 \\ 9 & 5 & 2 \\ \end{array} \right) $$
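These can be checked mechanically; a short stdlib-only Python verification (function name is my own) that each sample above has determinant 1:

```python
def det3(m):
    # determinant of a 3x3 matrix by cofactor expansion along the top row
    (a, b, e), (c, d, f), (i, h, g) = m
    return a * (d * g - f * h) - b * (c * g - f * i) + e * (c * h - d * i)

# the three sample matrices from the question
samples = [
    [[1, 1, 1], [-1, 0, 1], [-1, 0, 2]],
    [[1, 1, 1], [1, 2, 3], [2, 5, 9]],
    [[1, 1, 1], [-6, -5, -4], [9, 5, 2]],
]
print([det3(m) for m in samples])  # -> [1, 1, 1]
```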


There are 3 best solutions below

Answer 1:

Hint:

Consider a block-diagonal matrix whose first diagonal block is $\left( \begin{array}{cc} a & b \\ c & d \end{array} \right)$.

Answer 2:

You can just set $g = 1$ and $e = f = i = h = 0$:

$$1 = \begin{vmatrix} a & b \\ c & d \\ \end{vmatrix} = \begin{vmatrix} a & b & 0\\ c & d & 0\\ 0 & 0 & 1 \end{vmatrix}$$

Furthermore, the extension is never unique since:

$$1 = \begin{vmatrix} a & b \\ c & d \\ \end{vmatrix} = \begin{vmatrix} a & b & 0\\ c & d & 0\\ 0 & 0 & 1 \end{vmatrix} = \begin{vmatrix} a & b & 0\\ c & d & 0\\ a & b & 1 \end{vmatrix}$$

Note that $a$ and $b$ cannot both be $0$ (otherwise $ad - bc = 0 \ne 1$), so the two extensions are indeed different.
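A quick stdlib-only Python check of the two extensions above, for the sample matrix $(a,b,c,d) = (2,3,1,2)$, which has determinant 1 (helper name is my own):

```python
def det3(m):
    # determinant of a 3x3 matrix by cofactor expansion along the top row
    (a, b, e), (c, d, f), (i, h, g) = m
    return a * (d * g - f * h) - b * (c * g - f * i) + e * (c * h - d * i)

a, b, c, d = 2, 3, 1, 2          # a*d - b*c = 4 - 3 = 1
first = [[a, b, 0], [c, d, 0], [0, 0, 1]]   # g = 1, e = f = i = h = 0
second = [[a, b, 0], [c, d, 0], [a, b, 1]]  # bottom row replaced by (a, b, 1)
print(det3(first), det3(second), first != second)  # -> 1 1 True
```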

Answer 3:

Each element of $SL_2 \mathbb Z$, $$ \left( \begin{array}{cc} \alpha & \beta \\ \gamma & \delta \end{array} \right), $$ has a homomorphic image in $SL_3 \mathbb Z$: $$ \left( \begin{array}{ccc} \alpha^2 & 2 \alpha \beta & \beta ^2\\ \alpha \gamma & \alpha \delta + \beta \gamma & \beta \delta\\ \gamma^2 & 2 \gamma \delta & \delta^2 \end{array} \right) $$

If you take the transpose of the 3 by 3, you get an anti-homomorphism.
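The homomorphism property can be spot-checked on a pair of matrices; a stdlib-only Python sketch (helper names are my own):

```python
def sym2(m):
    # the SL2(Z) -> SL3(Z) map from the answer above
    (al, be), (ga, de) = m
    return [[al * al, 2 * al * be, be * be],
            [al * ga, al * de + be * ga, be * de],
            [ga * ga, 2 * ga * de, de * de]]

def matmul(p, q):
    # plain integer matrix multiplication on nested lists
    return [[sum(p[r][k] * q[k][c] for k in range(len(q)))
             for c in range(len(q[0]))]
            for r in range(len(p))]

# two generators of SL2(Z) with determinant 1
A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]
print(sym2(matmul(A, B)) == matmul(sym2(A), sym2(B)))  # -> True
```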