Minimal word length of factorization of invertible matrices into elementary matrices


Let $K$ be a field. As is well known, one can decompose every matrix $A \in GL(n,K)$ into a product of elementary matrices. By an elementary matrix, I mean a matrix which belongs to one of the following types of matrices, which correspond to elementary row resp. column operations (depending on the side from which you multiply one of these matrices):

I) $D_{j, \lambda} = diag(1,...,1, \lambda, 1,..., 1)$, a diagonal matrix which has $\lambda \in K \setminus \{ 0 \}$ as its $j$-th diagonal entry, or

II) $E_{ij}(\lambda)$ for $i \neq j$ and $\lambda \in K \setminus \{0\}$, which is the $n \times n$ identity matrix plus $\lambda$ in the $(i,j)$ entry; that is, $E_{ij}(\lambda) = I_n + \lambda e_{ij}$, where $e_{ij}$ is the matrix unit with $(i,j)$ entry $1$ and zeros elsewhere.
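For concreteness, here is a small Python sketch of the two types over $K = \mathbb{Q}$ (the names `D`, `E`, and `matmul` are mine, not standard), illustrating that left-multiplication by $E_{ij}(\lambda)$ performs a row operation and right-multiplication a column operation:

```python
def D(n, j, lam):
    """Type I: identity matrix with the (j, j) entry replaced by lam (0-indexed)."""
    M = [[int(r == c) for c in range(n)] for r in range(n)]
    M[j][j] = lam
    return M

def E(n, i, j, lam):
    """Type II: identity matrix plus lam in the (i, j) entry, for i != j."""
    M = [[int(r == c) for c in range(n)] for r in range(n)]
    M[i][j] = lam
    return M

def matmul(A, B):
    n = len(A)
    return [[sum(A[r][k] * B[k][c] for k in range(n)) for c in range(n)]
            for r in range(n)]

A = [[1, 2], [3, 4]]
# Left-multiplying by E(n, i, j, lam) adds lam * (row j) to row i:
left = matmul(E(2, 0, 1, 5), A)   # left == [[16, 22], [3, 4]]
# Right-multiplying adds lam * (column i) to column j:
right = matmul(A, E(2, 0, 1, 5))  # right == [[1, 7], [3, 19]]
```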

Now, given $A \in GL(n,K)$, is there a formula (or, at least a reasonable lower bound) for the minimal length of a factorization of $A$ into elementary matrices?


What I have found: Let $E(n,K)$ be the subgroup of $GL(n,K)$ generated by the matrices $E_{ij}(\lambda)$. Then Hinson proves that if $A \in E(n, K)$, then $A$ is a product of at most $n^2+n-2$ matrices of type II.

Answer:

I will just discuss worst-case bounds over all $A,$ not whether we can bound the length given $A.$

For a fixed finite field $K$, $\frac{n^2}{\log_{|K|}n}(1+o(1))$ elementary matrices are necessary and sufficient as $n\to\infty.$ See *The asymptotic complexity of matrix reduction over finite fields*.

For infinite $K,$ a dimension count gives a lower bound: $GL(n,K)$ is $n^2$-dimensional, while each elementary matrix contributes only one parameter, so some $A$ requires at least $n^2$ elementary matrices. On the other hand, Gaussian elimination achieves $n^2,$ so the worst case over all $A$ is exactly $n^2.$
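The Gaussian-elimination upper bound can be sketched in Python over $K = \mathbb{Q}$ (a minimal sketch; the function and helper names are mine): reduce $A$ to $I$ by elementary row operations, i.e. $E_k \cdots E_1 A = I$, so that $A = E_1^{-1} \cdots E_k^{-1}$, and note that the inverse of each elementary matrix is again elementary.

```python
from fractions import Fraction

def eye(n):
    return [[Fraction(int(r == c)) for c in range(n)] for r in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[r][k] * B[k][c] for k in range(n)) for c in range(n)]
            for r in range(n)]

def elementary_factorization(A):
    """Factor an invertible rational matrix A into elementary matrices by
    row-reducing A to the identity and inverting each step.  Returns the
    factors in product order, so that their product equals A."""
    n = len(A)
    A = [[Fraction(x) for x in row] for row in A]
    factors = []
    for c in range(n):
        if A[c][c] == 0:          # repair a zero pivot: add a lower row (type II, lam = 1)
            r = next(r for r in range(c + 1, n) if A[r][c] != 0)
            for j in range(n):
                A[c][j] += A[r][j]
            inv = eye(n); inv[c][r] = Fraction(-1)   # inverse of the operation
            factors.append(inv)
        lam = A[c][c]             # scale the pivot to 1 (type I)
        for j in range(n):
            A[c][j] /= lam
        inv = eye(n); inv[c][c] = lam
        factors.append(inv)
        for r in range(n):        # clear the rest of column c (type II)
            if r != c and A[r][c] != 0:
                mu = A[r][c]
                for j in range(n):
                    A[r][j] -= mu * A[c][j]
                inv = eye(n); inv[r][c] = mu
                factors.append(inv)
    return factors

A = [[0, 1], [2, 3]]
F = elementary_factorization(A)
P = eye(2)
for M in F:
    P = matmul(P, M)
# P == [[0, 1], [2, 3]] and len(F) == 5
```

Here $5 = n^2 + 1$ because one extra operation repairs the zero pivot; when every pivot is already nonzero, the count is one scaling plus at most $n-1$ clearing operations per column, i.e. at most $n^2$ factors in total.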