Let $A$ be a (skew-)symmetric matrix over $\mathbb{Z}/2$. (In fact, I would take $A$ to be the linking matrix of an oriented framed link in $S^3$ or the matrix representing the intersection form of a closed smooth 4-manifold. The statement below, however, does seem to hold in general.) I am interested in the following linear system over $\mathbb{Z}/2$: $$a_{i1}x_1+a_{i2}x_2+\cdots+a_{in}x_n=a_{ii},\quad i=1,\cdots,n.$$
This system is known to always have a solution (cf. Saveliev's *Lectures on the Topology of 3-Manifolds*). But I cannot see why this is true unless $A$ is nonsingular over $\mathbb{Z}/2$. Is there a general method to deal with these kinds of linear systems?
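For what it's worth, here is a quick brute-force experiment (my own script, and of course not a proof) that found no counterexample among random small symmetric matrices:

```python
# Quick experiment (not a proof): check that A x = diag(A) is solvable over Z/2
# for many random symmetric matrices A, by brute force over all x in (Z/2)^n.
import itertools
import random

def solvable(A, b, n):
    # try every vector x in (Z/2)^n
    for x in itertools.product([0, 1], repeat=n):
        if all(sum(A[i][j] * x[j] for j in range(n)) % 2 == b[i] for i in range(n)):
            return True
    return False

n = 5
for _ in range(1000):
    # random symmetric matrix over Z/2
    A = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            A[i][j] = A[j][i] = random.randint(0, 1)
    diag = [A[i][i] for i in range(n)]
    assert solvable(A, diag, n), "counterexample found"
print("no counterexample among 1000 random 5x5 symmetric matrices")
```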
This is true, but it is a bit tricky. The idea is simply to write the matrix in the form $$A=BB^T$$ in such a way that the column space of $B$ is equal to that of $A$. One inclusion is automatic: each column of $A$ is $B$ times the corresponding column of $B^T$, hence a linear combination of the columns of $B$. But it is not clear to me how to achieve the reverse inclusion directly; it clearly fails for some choices of $B$ (for instance, $B=\begin{pmatrix}1&1\\1&1\end{pmatrix}$ gives $BB^T=0$, yet $\operatorname{col}(B)\neq0$).
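Here is a small example of the kind of factorization I have in mind (the $B$ below was found by hand, so take it only as an illustration): $$A=\begin{pmatrix}1&1\\1&0\end{pmatrix}=\begin{pmatrix}1&0\\1&1\end{pmatrix}\begin{pmatrix}1&1\\0&1\end{pmatrix}=BB^T.$$ Here $\operatorname{col}(B)=\operatorname{col}(A)=\Bbb{Z}_2^2$, and the two columns of $B$ sum to $(1,0)^T$, the diagonal of $A$.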
So at this time I cannot write a completely self-contained proof; I need to refer to two articles. IIRC only the first is needed; I include the second because I found the first by reading it.
The problem Lempel (of Lempel–Ziv fame) solves in the first article is the following. He wants to write a given symmetric $n\times n$ matrix $A$ over $\Bbb{Z}_2$ in the form $A=BB^T$ as efficiently as possible; that is, he wants to minimize the number $m$ of columns of $B$. His answer is that (if I recall it correctly) the minimum is $m=\operatorname{rank}(A)$ whenever $A$ has a nonzero diagonal entry, and $m=\operatorname{rank}(A)+1$ when the diagonal of $A$ is zero (and $A\neq0$).
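To illustrate the zero-diagonal case (my example, not one from the paper): $$A=\begin{pmatrix}0&1\\1&0\end{pmatrix}$$ has rank $2$ and zero diagonal, and no $2$-column $B$ can work, because rows of even weight in $\Bbb{Z}_2^2$ come from $\{(0,0),(1,1)\}$ and are pairwise orthogonal; but the $3$-column matrix $$B=\begin{pmatrix}1&1&0\\0&1&1\end{pmatrix}$$ does satisfy $BB^T=A$.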
We can apply Lempel's result to settle this question as follows. If the diagonal of $A$ is zero, then $x=0$ is a solution and there is nothing to prove, so assume $A$ has a nonzero diagonal entry and write $A=BB^T$ with $m=\operatorname{rank}(A)$ columns. Every column of $A$ lies in the column space of $B$, so $\operatorname{col}(A)\subseteq\operatorname{col}(B)$; since $\dim\operatorname{col}(B)\le m=\operatorname{rank}(A)$, the two spaces are equal. Over $\Bbb{Z}_2$ we have $b_{ij}^2=b_{ij}$, so $$a_{ii}=\sum_{j=1}^m b_{ij}b_{ij}=\sum_{j=1}^m b_{ij};$$ in other words, the sum of the columns of $B$ is exactly the diagonal vector $(a_{11},\dots,a_{nn})^T$. Hence the diagonal lies in $\operatorname{col}(B)=\operatorname{col}(A)$, which is precisely the statement that the system has a solution.
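And to answer the last part of the question: once solvability is known, ordinary Gaussian elimination over $\Bbb{Z}_2$ produces an explicit solution. A minimal sketch (my own code, not taken from either article):

```python
# A minimal sketch (my own code, not from either article): solve A x = diag(A)
# over Z/2 by Gaussian elimination on the augmented matrix [A | diag(A)].
def solve_mod2(A):
    n = len(A)
    M = [list(row) + [A[i][i]] for i, row in enumerate(A)]  # augment with diagonal
    pivots, r = [], 0
    for c in range(n):
        # find a pivot row for column c
        p = next((i for i in range(r, n) if M[i][c] == 1), None)
        if p is None:
            continue  # no pivot here; x_c stays a free variable
        M[r], M[p] = M[p], M[r]
        # clear column c in all other rows (reduced row echelon form)
        for i in range(n):
            if i != r and M[i][c] == 1:
                M[i] = [(a + b) % 2 for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    # a row (0 ... 0 | 1) would mean the system is inconsistent;
    # by the argument above this cannot happen for symmetric A
    assert all(M[i][n] == 0 for i in range(r, n))
    x = [0] * n  # free variables set to 0
    for i, c in enumerate(pivots):
        x[c] = M[i][n]
    return x
```

For instance, `solve_mod2([[1, 1], [1, 0]])` returns `[0, 1]`, and indeed $x=(0,1)^T$ satisfies $Ax=(1,0)^T$, the diagonal of $A$.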
This feels unnecessarily kludgy. The idea of using $A=BB^T$ came to me intuitively. I calculated several examples and noticed that the columns of $B$ sum up to the diagonal. Light bulb time!