How is $(x_1,x_2)$ normal to $x_1w_1 + x_2w_2 = y$?


Note: this question is related to the maths of Neural Nets, if you need clarification about the question do comment.

Raul Rojas' *Neural Networks: A Systematic Introduction*, section 8.1.2, relates off-line and on-line backpropagation to the Gauss-Jacobi and Gauss-Seidel methods for finding the intersection of two lines.

What I can't understand is how the iterations of on-line backpropagation are perpendicular to the (current) constraint. More specifically: the gradient of the error $\frac12(x_1w_1 + x_2w_2 - y)^2$ with respect to $(w_1,w_2)$ is a scalar multiple of $(x_1,x_2)$ — but why is $(x_1,x_2)$ normal to the constraint line $x_1w_1 + x_2w_2 = y$?
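To make the claim concrete, here is a small check (my own sketch, not code from the book): the gradient of the per-example error with respect to the weights is the residual $x_1w_1 + x_2w_2 - y$ times the vector $(x_1, x_2)$, so every on-line step points along $(x_1, x_2)$.

```python
# Illustrative check: the per-example error
#   E(w) = 0.5 * (x1*w1 + x2*w2 - y)**2
# has gradient dE/dw = (x1*w1 + x2*w2 - y) * (x1, x2),
# i.e. a scalar multiple of (x1, x2).

def grad_E(w, x, y):
    """Gradient of 0.5*(x.w - y)^2 with respect to w."""
    r = x[0] * w[0] + x[1] * w[1] - y   # residual x.w - y
    return (r * x[0], r * x[1])

x, y = (3.0, 4.0), 2.0
w = (1.0, -1.0)
g = grad_E(w, x, y)
# residual r = 3 - 4 - 2 = -3, so g = r*(x1, x2) = (-9.0, -12.0)
print(g)  # (-9.0, -12.0)
```

The 2-D cross product $g_1 x_2 - g_2 x_1$ vanishes, confirming the gradient is parallel to $(x_1, x_2)$ for any $w$.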


There are 2 answers below.

Accepted answer:

If you choose two points $(w_1, w_2), (v_1, v_2)$ along this line, then $$(x_1, x_2) \cdot ((w_1, w_2) - (v_1, v_2)) = x_1 w_1 + x_2 w_2 - (x_1 v_1 + x_2 v_2) = y - y = 0.$$ That is, the direction $(x_1, x_2)$ is perpendicular to any vector lying along the line, i.e. $(x_1, x_2)$ is normal to the line.
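A quick numeric sanity check of this argument (example values of my own choosing): take two points on the line and verify that their difference is orthogonal to $(x_1, x_2)$.

```python
# Pick x = (x1, x2) and two points w, v on the line x1*w1 + x2*w2 = y,
# then verify (x1, x2) . (w - v) == 0.

x1, x2, y = 2.0, 1.0, 4.0

def point_on_line(w1):
    """Given w1, solve x1*w1 + x2*w2 = y for w2."""
    return (w1, (y - x1 * w1) / x2)

w = point_on_line(0.0)   # (0.0, 4.0)
v = point_on_line(3.0)   # (3.0, -2.0)
dot = x1 * (w[0] - v[0]) + x2 * (w[1] - v[1])
print(dot)  # 0.0
```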

Second answer:

Suppose you pick two points in the plane $x\cdot w=y$. The displacement between them is a vector $dw$ satisfying $x\cdot dw=0$. Therefore, $x$ is normal to every such displacement, i.e. normal to the plane.
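Tying this back to the on-line iteration (my own sketch, not from the book): a gradient step for a single constraint $x\cdot w = y$ moves $w$ along the normal direction $x$, and with step size $1/\|x\|^2$ it lands exactly on the constraint line, which is why each on-line update is perpendicular to the current constraint.

```python
# Sketch: an on-line update for one constraint x.w = y steps along
# the normal direction x.  With learning rate 1/||x||^2 the step
# lands exactly on the line (an orthogonal projection).

def online_step(w, x, y):
    r = x[0] * w[0] + x[1] * w[1] - y       # residual x.w - y
    eta = 1.0 / (x[0] ** 2 + x[1] ** 2)     # exact-projection rate
    return (w[0] - eta * r * x[0], w[1] - eta * r * x[1])

x, y = (1.0, 1.0), 4.0
w = online_step((0.0, 0.0), x, y)
print(w)                                    # (2.0, 2.0)
print(x[0] * w[0] + x[1] * w[1] - y)        # residual is now 0.0
```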