Consider the following homogeneous, underdetermined linear system of equations:
$$ \begin{align} c + d &= k + l \\ j + k &= c + d \\ i + l &= a + b \end{align} $$
As it stands (assuming, e.g., $a,b,c,d,i,j,k,l \in \mathbb{R}$), this system is easily transformed into the standard form $Ax = 0$, with $A$ the coefficient matrix and $x$ the vector of unknowns. We could then simply use Gaussian elimination to characterize the solution space.
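For concreteness: moving all terms to the left-hand side and fixing the variable ordering $x = (a, b, c, d, i, j, k, l)^\top$ (the ordering is an arbitrary choice), the coefficient matrix reads
$$ A = \begin{pmatrix} 0 & 0 & 1 & 1 & 0 & 0 & -1 & -1 \\ 0 & 0 & -1 & -1 & 0 & 1 & 1 & 0 \\ -1 & -1 & 0 & 0 & 1 & 0 & 0 & 1 \end{pmatrix}. $$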
However, in reality I have the additional constraint $a,b,c,d,i,j,k,l \in \{+1,-1\}$, which obviously reduces the number of admissible solutions. My question is whether there is a good way to encode this additional constraint, e.g. as a formula that can be taken into account directly while solving the system, rather than applying the constraint in a second step to "screen" the solution space.
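To make the "screening" approach concrete, here is a minimal brute-force sketch (my own illustration, not part of the actual problem I need to solve) that enumerates all $2^8$ sign vectors and keeps those satisfying the three equations:

```python
from itertools import product

# Enumerate all 2^8 assignments of (a, b, c, d, i, j, k, l) in {+1, -1}
# and keep those that satisfy the three original equations.
solutions = [
    (a, b, c, d, i, j, k, l)
    for a, b, c, d, i, j, k, l in product((+1, -1), repeat=8)
    if c + d == k + l       # equation 1
    and j + k == c + d      # equation 2
    and i + l == a + b      # equation 3
]

print(len(solutions))  # number of admissible sign vectors
```

This works for 8 variables, but the $2^n$ enumeration is exactly the kind of second-step screening I would like to avoid for larger systems.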
The obvious way to encode this constraint as an equation would be $a^2 = 1$ (and likewise for the other variables), but such an equation is no longer linear, so I would be leaving the well-known realm of linear algebra and would instead have to solve a non-linear system of equations.
Therefore, my question is: is there a way to solve directly for the solutions that satisfy the additional constraint within the framework of linear algebra (e.g. by some clever adaptation of Gaussian elimination)?
Note: I am looking for a way to solve this kind of system algorithmically, so my question really amounts to: is there a standard algorithm for solving this kind of system?