Higman’s Lemma for Matrices?


It is well-known that the set of finite sequences on a finite alphabet is well-quasi-ordered by the subsequence relation.

Question: Is the set of finite matrices on a finite alphabet well-quasi-ordered by the submatrix relation? How to prove or disprove?

In other words, if $(M_1, M_2,\dots)$ is an infinite sequence of finite matrices over a finite alphabet, must there exist $i<j$ such that $M_i$ is a submatrix of $M_j$? (For matrices $A$ and $B$, $A$ is said to be a submatrix of $B$ iff $A$ can be obtained by deleting some, possibly no, rows and/or columns of $B$.)
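For small matrices the submatrix relation is easy to test directly. Here is a brute-force sketch in Python (the helper name `is_submatrix` is ours, not from the exercise); it simply tries every way of keeping the right number of rows and columns of $B$:

```python
from itertools import combinations

def is_submatrix(A, B):
    """True iff A can be obtained from B by deleting rows and/or columns.

    A, B are matrices given as lists of rows. Exhaustive search over all
    choices of len(A) rows and len(A[0]) columns of B, so this is only
    practical for small matrices.
    """
    ra, ca = len(A), len(A[0])
    rb, cb = len(B), len(B[0])
    if ra > rb or ca > cb:
        return False
    return any(
        all(B[r][c] == A[i][j]
            for i, r in enumerate(rows)
            for j, c in enumerate(cols))
        for rows in combinations(range(rb), ra)
        for cols in combinations(range(cb), ca)
    )
```

For example, `is_submatrix([[1]], [[0, 1], [1, 0]])` holds (delete one row and one column), while `[[1, 1]]` is not a submatrix of the $2 \times 2$ identity matrix.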

A general version of this question is posed (unanswered) as "Exercise 1.12 (Higman’s Lemma for Matrices?)" on p. 17 in Algorithmic Aspects of WQO Theory by Schmitz & Schnoebelen.

(I have no idea how to approach this.)

Best Answer

Here's a counterexample on the alphabet $\{0, 1\}$: for $n \geq 2$, let $M_n$ be the $n \times n$ matrix $$(M_n)_{i, j} = \begin{cases} 1 & \text{if } j = i \text{ or } j = i+1 \\ 0 & \text{otherwise} \end{cases}$$ where addition and subtraction of indices are taken mod $n$. Note in particular that $M_n$ has exactly two $1$'s in each row and in each column.
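Assuming 0-based indices, the construction can be sketched in Python (the name `M` is ours) and the two-$1$'s-per-row-and-column property checked directly:

```python
def M(n):
    """The n x n 0/1 matrix with a 1 exactly at (i, i) and (i, (i+1) mod n)."""
    return [[1 if j == i or j == (i + 1) % n else 0 for j in range(n)]
            for i in range(n)]

# every row and every column of M_n contains exactly two 1's
assert all(sum(row) == 2 for row in M(5))
assert all(sum(col) == 2 for col in zip(*M(5)))
```

For instance, `M(3)` is `[[1, 1, 0], [0, 1, 1], [1, 0, 1]]`: each row has $1$'s in positions $i$ and $i+1 \bmod 3$.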

Suppose $M_a$ is a submatrix of $M_b$ for some $a, b$. This means there are subsets $R, C \subseteq \{1, \dots, b\}$ with $|R| = |C| = a$ such that $M_a$ is the submatrix of $M_b$ obtained by removing all rows except those in $R$ and all columns except those in $C$. Note the following:

  • If $r \in R$, since there are two $1$'s in the corresponding row of $M_a$, we must have $r, r+1 \in C$, since $r, r+1$ are the only two columns of $M_b$ which have $1$'s in row $r$.
  • If $c \in C$, since there are two $1$'s in the corresponding column of $M_a$, we must have $c-1, c \in R$, since these are the only two rows of $M_b$ which have $1$'s in column $c$.

Putting these two together, this means that for $r \in R$, we have $r \in C$ and $r+1 \in C$, and the latter implies $r+1 \in R$. Thus $r \in R \Rightarrow r+1 \in R$, so since $R$ is nonempty we must have $R = \{1, \dots, b\}$. Also, since $r \in R \Rightarrow r \in C$, we have $C = \{1, \dots, b\}$ as well. It follows that $M_a = M_b$, so $a = b$. Thus no $M_a$ appears as a submatrix of $M_b$ for $a < b$.
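The conclusion can be sanity-checked by brute force for small sizes. This self-contained sketch (helper names `M` and `is_submatrix` are ours) confirms that no $M_a$ embeds in a strictly larger $M_b$:

```python
from itertools import combinations

def M(n):
    # cyclic band matrix: 1 exactly at (i, i) and (i, (i+1) mod n), 0-based
    return [[1 if j == i or j == (i + 1) % n else 0 for j in range(n)]
            for i in range(n)]

def is_submatrix(A, B):
    # brute force: try every way of keeping len(A) rows and len(A[0]) columns of B
    ra, ca, rb, cb = len(A), len(A[0]), len(B), len(B[0])
    if ra > rb or ca > cb:
        return False
    return any(
        all(B[r][c] == A[i][j]
            for i, r in enumerate(rows)
            for j, c in enumerate(cols))
        for rows in combinations(range(rb), ra)
        for cols in combinations(range(cb), ca)
    )

# no M_a is a submatrix of M_b for a < b (checked for small cases)
for a in range(2, 6):
    for b in range(a + 1, 7):
        assert not is_submatrix(M(a), M(b))
```

Of course this only checks finitely many cases; the proof above is what rules out an embedding for all $a < b$.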