Let $F$ be a field. I am interested in solving equations of the form $$\text{diag}(x_1,\dots,x_n) (Ax - b) = 0 $$ for $x = (x_1,\dots,x_n) \in F^n$, where $A$ is an $n \times n$ matrix over $F$ and $b \in F^n$.
For any $J \subseteq \{1,\dots,n\}$ we can consider the matrix $A_J$ obtained by keeping only the rows and columns of $A$ whose indices lie in $J$. Let $$L_J = \{x \:|\: A_J x_J = b_J \text{ and } x_i = 0 \text{ if } i \not\in J \}$$ where $x_J$ and $b_J$ are defined analogously to $A_J$. Then the set of solutions $L$ of the above equation satisfies $L = \bigcup_J L_J$: the equation says that for each $i$ either $x_i = 0$ or $(Ax)_i = b_i$, and taking $J$ to be (a superset of) the support of $x$ recovers exactly the conditions defining $L_J$. Thus, $L$ can be determined by computing $L_J$ for every $J \subseteq \{1,\dots,n\}$.
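For concreteness, here is a brute-force sketch of this enumeration in Python over $F = \mathbb{Q}$, using `Fraction` for exact arithmetic. To keep it short it only records the subsets $J$ for which $A_J$ is invertible, so each $L_J$ it finds is a single point; singular $A_J$ (where $L_J$ is empty or an affine subspace) is simply skipped. All function names are made up for illustration.

```python
from fractions import Fraction
from itertools import combinations

def solve_square(A, b):
    """Gauss-Jordan elimination over Fraction; returns the unique solution
    of A x = b if A is invertible, else None (singular case skipped here)."""
    n = len(A)
    M = [[Fraction(A[i][j]) for j in range(n)] + [Fraction(b[i])] for i in range(n)]
    for col in range(n):
        piv = next((r for r in range(col, n) if M[r][col] != 0), None)
        if piv is None:
            return None  # A_J singular: L_J empty or an affine subspace
        M[col], M[piv] = M[piv], M[col]
        inv = 1 / M[col][col]
        M[col] = [v * inv for v in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [vr - f * vc for vr, vc in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

def support_solutions(A, b):
    """Enumerate all 2^n subsets J, solve A_J x_J = b_J, and pad
    each solution with zeros outside J."""
    n = len(A)
    sols = set()
    for k in range(n + 1):
        for J in combinations(range(n), k):
            AJ = [[A[i][j] for j in J] for i in J]
            bJ = [b[i] for i in J]
            xJ = solve_square(AJ, bJ)
            if xJ is None:
                continue
            x = [Fraction(0)] * n
            for idx, j in enumerate(J):
                x[j] = xJ[idx]
            sols.add(tuple(x))
    return sols
```

For example, with $A = I$ and $b = (1,2)$ the equation $x_i(x_i - b_i) = 0$ has the four solutions $\{0,1\} \times \{0,2\}$, one for each subset $J$.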
Is there a better way to do this than to calculate the $2^n$ sets $L_J$ separately?
Motivation:
If I want to know the extreme values of a function $f: \mathbb{R}^n \to \mathbb{R}$ on the nonnegative orthant, this is the same as looking for extrema of $g(x) = f(x_1^2,\dots,x_n^2)$, and the gradient of $g$ is the product of $2 \text{ diag}(x_1,\dots,x_n)$ and the gradient of $f$ (evaluated at $(x_1^2,\dots,x_n^2)$). If $f$ is of the form $f(x) = x^T B x + x^T b$, you essentially end up looking for solutions of equations of the sort described above.