Linear equations over $\mathbb{F}_2$ - how to find the solution closest to the zero vector


I want to solve a system of linear equations over the field $\mathbb{F}_2$, such that the solution vector is as close to the zero vector as possible.

For example, suppose I have a system of equations with the augmented matrix

$$ \left[ \begin{array}{cccc|c} 1 & 0 & 0 & 1 & 1\\ 0 & 1 & 0 & 1 & 1\\ 0 & 0 & 1 & 1 & 1\\ \end{array} \right] $$

with unknowns $a, b, c,$ and $d$. We can clearly see that $a = b = c = 1 + d$, and we can choose any one variable freely. For $d = 1$, we get the solution vector $a = b = c = 0$, while for $d = 0$ we get $a = b = c = 1$. In the first case, only one variable ($d$) is nonzero, while in the second, the other three are.

Does anybody know if there is some method of choosing the free variables in such a way that we get the solution with the fewest non-zero elements? In this case, it would be with $d = 1$. However, if I had 25 variables and a 13-dimensional solution space, the problem wouldn't be so trivial.

I tried to solve this problem, but I really didn't get anywhere. My only idea is the brute-force method of checking all $2^k$ possibilities, where $k$ is the number of free variables in the solution. However, this isn't acceptable for large $k$.
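For concreteness, here is a sketch of that brute-force baseline (the function name and representation are my own): row-reduce the augmented matrix over $\mathbb{F}_2$, then try all $2^k$ assignments of the free variables and keep the lowest-weight solution. It works fine for the toy example, but as noted it blows up for large $k$.

```python
from itertools import product

def solve_min_weight(A, b):
    """Brute force: row-reduce [A|b] over GF(2), then try all 2^k
    assignments of the free variables and keep the lowest-weight solution."""
    m, n = len(A), len(A[0])
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix of 0/1 ints
    pivots = []   # (row, col) of each pivot after reduction
    r = 0
    for c in range(n):
        for i in range(r, m):          # find a row with a 1 in column c
            if M[i][c]:
                M[r], M[i] = M[i], M[r]
                break
        else:
            continue
        for i in range(m):             # eliminate column c from all other rows
            if i != r and M[i][c]:
                M[i] = [x ^ y for x, y in zip(M[i], M[r])]
        pivots.append((r, c))
        r += 1
    for i in range(r, m):              # zero row with nonzero rhs => no solution
        if M[i][n]:
            return None
    pivot_cols = {c for _, c in pivots}
    free = [c for c in range(n) if c not in pivot_cols]
    best = None
    for bits in product([0, 1], repeat=len(free)):
        x = [0] * n
        for c, v in zip(free, bits):
            x[c] = v
        for i, c in pivots:
            # pivot variable = rhs XOR contributions of the free variables
            val = M[i][n]
            for fc, v in zip(free, bits):
                val ^= M[i][fc] & v
            x[c] = val
        if best is None or sum(x) < sum(best):
            best = x
    return best

# The 3x4 system from the question: a = b = c = 1 + d
A = [[1, 0, 0, 1], [0, 1, 0, 1], [0, 0, 1, 1]]
b = [1, 1, 1]
print(solve_min_weight(A, b))   # -> [0, 0, 0, 1], i.e. d = 1
```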

Can anybody give me any hints on how this can be done (efficiently)? Is it even possible?

Thanks for any help!

-Tusike


There are 2 best solutions below


You can make it into an integer linear programming problem. In your example,

minimize $$a+b+c+d$$

subject to

$$\begin{aligned} a + d &= 1 + 2 s_1\\ b + d &= 1 + 2 s_2\\ c + d &= 1 + 2 s_3 \end{aligned}$$

$$a, b, c, d \in \{0,1\}, \qquad s_1, s_2, s_3 \in \mathbb{N}$$

CPLEX or similar solvers might handle problems with a few dozen variables and constraints without too much trouble.
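As a sanity check of this formulation on the toy instance (pure Python, no solver needed), one can enumerate all 0/1 assignments and keep the feasible one with the smallest objective. The slack variables $s_i$ simply encode "equal mod 2", so feasibility reduces to parity checks:

```python
from itertools import product

# Enumerate all 0/1 assignments of (a, b, c, d) and keep the one that
# satisfies every parity constraint with the smallest objective a+b+c+d.
# "x + y = 1 + 2*s" with s a nonnegative integer means x + y ≡ 1 (mod 2).
best = None
for a, b, c, d in product([0, 1], repeat=4):
    if (a + d) % 2 == 1 and (b + d) % 2 == 1 and (c + d) % 2 == 1:
        if best is None or a + b + c + d < best[1]:
            best = ((a, b, c, d), a + b + c + d)
print(best)   # -> ((0, 0, 0, 1), 1): the d = 1 solution of weight 1
```

An actual MIP solver replaces this exhaustive loop with branch-and-bound over the same model, which is what makes moderately sized instances tractable.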


This is equivalent to the decoding problem of a binary linear code. You have the parity-check matrix augmented with the syndrome vector, and you are looking for the lowest-weight error vector that gives rise to the known syndrome.

This means that in full generality the problem is surely intractable. I don't remember a proven result for this exact formulation, but I do remember that the following variant has been shown to be hard. Assume the augmented column is all zeros, and look for a non-zero solution with as few variables $\neq 0$ as possible; deciding whether there exists such a solution with the number of nonzero variables below a given bound is known to be NP-hard. IIRC the result is due to Alex Vardy (UIUC).

So if you could solve this in polynomial time, you could make a killing in coding theory. The approach suggested by Professor Israel has been tried: the pure simplex algorithm runs into problems because the optimal solutions can have fractional entries and the like. Unfortunately, my exposure to that line of research is limited to a couple of conference talks.