Note: this question concerns the mathematics of neural networks; if you need clarification about the question, please comment.
Raul Rojas' *Neural Networks: A Systematic Introduction*, section 8.1.2, relates off-line and on-line backpropagation to the Gauss-Jacobi and Gauss-Seidel methods for finding the intersection of two lines.
What I can't understand is why the iterations of on-line backpropagation move perpendicular to the (current) constraint. More specifically: the gradient of $\frac12(x_1w_1 + x_2w_2 - y)^2$ with respect to $(w_1, w_2)$ is $(x_1w_1 + x_2w_2 - y)\,(x_1, x_2)$, a scalar multiple of $(x_1, x_2)$. Why is the direction $(x_1, x_2)$ normal to the constraint $x_1w_1 + x_2w_2 = y$?
If you choose two points $(w_1, w_2), (v_1, v_2)$ along this line, then $$(x_1, x_2) \cdot ((w_1, w_2) - (v_1, v_2)) = x_1 w_1 + x_2 w_2 - (x_1 v_1 + x_2 v_2) = y - y = 0.$$ That is, the direction $(x_1, x_2)$ is perpendicular to any vector lying along the line, i.e. $(x_1, x_2)$ is normal to the line.
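The argument above can be checked numerically. The sketch below picks an arbitrary example $(x_1, x_2) = (3, 4)$ with target $y = 5$ (these concrete values are my own choice, not from the book), puts two weight vectors on the constraint line, and verifies that their difference is orthogonal to $(x_1, x_2)$:

```python
import numpy as np

# Assumed concrete training example (x1, x2) and target y.
x = np.array([3.0, 4.0])
y = 5.0

# Two weight vectors w, v chosen to satisfy the constraint x1*w1 + x2*w2 = y.
w = np.array([y / x[0], 0.0])   # intercept on the w1-axis
v = np.array([0.0, y / x[1]])   # intercept on the w2-axis
assert np.isclose(x @ w, y) and np.isclose(x @ v, y)

# w - v lies along the constraint line; its dot product with x vanishes,
# so the gradient direction (x1, x2) is normal to the line.
print(x @ (w - v))  # 0.0
```

The same cancellation happens for any two points on the line, which is exactly the algebraic computation above: both dot products with $x$ equal $y$, so the difference is zero.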