Given variables $a_1, a_2, ..., a_n$ and a set of (linear) equations between them, how do I find a minimum subset of these equations that derives a given (valid) equation?
Example: Given $a_1, a_2, ..., a_{10}$, and these 4 equations:
$$1. \ a_1 - a_2 = a_3 - a_4$$
$$2. \ a_3 - a_4 = a_5 - a_6$$
$$3. \ a_5 - a_6 = a_1 - a_7$$
$$4. \ a_8 - a_9 = a_{8} - a_{10}$$
Given the valid equation $a_2 = a_7$, the minimum subset of equations that derive this is $\{1, 2, 3\}$.
Given the valid equation $a_1 - a_5 = a_2 - a_6$, the minimum set is $\{1, 2\}$.
Given the valid equation $a_9 = a_{10}$, the answer is $\{4\}$.
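To spell out the first example, chaining equations 1, 2, and 3 gives
$$a_1 - a_2 \;\overset{(1)}{=}\; a_3 - a_4 \;\overset{(2)}{=}\; a_5 - a_6 \;\overset{(3)}{=}\; a_1 - a_7,$$
and cancelling $a_1$ from the two ends yields $a_2 = a_7$.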
This question arises in an engineering problem I'm working on. There, the linear equations are always of the form (1) $v_a - v_b = v_c - v_d$, where $a, b, c, d$ are not necessarily distinct, or (2) $v_a - v_b = 1$, and the target equation is always of the form $v_p = v_q$ or $v_x - v_y = v_m - v_n$.
I figured if there is a principle that solves this, then it should work the same for general linear equations as well, hence the generality of the question.
Of course there is the option of brute-forcing through all the subsets, but I have a feeling that there should be some clever linear-algebra procedure or matrix decomposition that produces this subset.
For linear equations with integer coefficients, you can solve the problem via integer linear programming as follows. Write equation $i$ as $\sum_j c_{ij} a_j = d_i$, where $c_{ij}, d_i \in\mathbb{Z}$, and write the target equation as $\sum_j b_j a_j = 0$, where $b_j\in\mathbb{Z}$. For each equation $i$, introduce nonnegative integer decision variables $x_i$ and $y_i$, so that the net multiplier applied to equation $i$ is $x_i - y_i$ (positive, negative, or zero). The problem is to minimize $\sum_i (x_i+y_i)$ subject to the linear constraints \begin{align} \sum_i c_{ij}(x_i-y_i) &= b_j &&\text{for all $j$}\\ \sum_i d_i(x_i-y_i) &= 0. \end{align} Note that this objective charges each equation its absolute multiplier, so an equation used with multiplier $\pm k$ costs $k$. If you want each used equation to count exactly once regardless of its multiplier, additionally introduce binary variables $u_i$, add the constraints $x_i + y_i \le M u_i$ for a suitably large constant $M$, and minimize $\sum_i u_i$ instead.
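To illustrate on the question's example, here is a minimal pure-Python sketch that, instead of calling an ILP solver, brute-forces the net multipliers $z_i = x_i - y_i$ over a small bounded range and minimizes $\sum_i |z_i|$ (which equals the ILP objective $\sum_i (x_i+y_i)$ at an optimum). The names `C`, `D`, `B`, `min_certificate`, and the search bound are my own; for realistic instance sizes you would hand the same constraints to an actual ILP solver.

```python
from itertools import product

# Equations written as sum_j C[i][j] * a_{j+1} = D[i], over a_1..a_10.
# These are equations 1-4 from the question, moved to the form LHS = 0.
C = [
    [1, -1, -1, 1, 0, 0, 0, 0, 0, 0],   # a1 - a2 = a3 - a4
    [0, 0, 1, -1, -1, 1, 0, 0, 0, 0],   # a3 - a4 = a5 - a6
    [-1, 0, 0, 0, 1, -1, 1, 0, 0, 0],   # a5 - a6 = a1 - a7
    [0, 0, 0, 0, 0, 0, 0, 0, -1, 1],    # a8 - a9 = a8 - a10
]
D = [0, 0, 0, 0]

# Target a2 = a7, written as sum_j B[j] * a_{j+1} = 0.
B = [0, 1, 0, 0, 0, 0, -1, 0, 0, 0]

def min_certificate(C, D, B, bound=2):
    """Find integer multipliers z_i in [-bound, bound] with
    sum_i z_i * C[i] == B and sum_i z_i * D[i] == 0,
    minimizing sum_i |z_i|.  Returns (cost, z) or None."""
    best = None
    for z in product(range(-bound, bound + 1), repeat=len(C)):
        if sum(zi * di for zi, di in zip(z, D)) != 0:
            continue
        combo = [sum(zi * C[i][j] for i, zi in enumerate(z))
                 for j in range(len(B))]
        if combo != B:
            continue
        cost = sum(abs(zi) for zi in z)
        if cost and (best is None or cost < best[0]):
            best = (cost, z)
    return best

cost, z = min_certificate(C, D, B)
used = [i + 1 for i, zi in enumerate(z) if zi != 0]
print(cost, used)  # cost 3, using equations {1, 2, 3}
```

The optimal multipliers here are $z = (-1, -1, -1, 0)$: summing the negatives of equations 1–3 leaves exactly $a_2 - a_7 = 0$, recovering the subset $\{1, 2, 3\}$ from the question.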