I am trying to understand neural networks and the gradient computation in backpropagation.
Specifically, I am trying to work out the error term for the output layer and for an inner layer.
equation 9.3-48:
$I_j = \sum_{k=1}^{N_k} w_{jk} \cdot O_k$
equation 9.3-54:
$\frac{\partial I_q}{\partial w_{qp}} = \frac{\partial}{\partial w_{qp}} \sum_{p=1}^{N_p} w_{qp} \cdot O_p = O_p$
Can someone explain why it's $O_p$ and not $\sum_{p=1}^{N_p} O_p$? In my mind the summation shouldn't drop out...
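I also tried to check this numerically with a little script of my own (the values and variable names are just made up by me, not from the book). Perturbing a single weight $w_{qp}$ does seem to give $O_p$, but I still don't see why analytically:

```python
# toy values (arbitrary, just to test the claim numerically)
O = [0.3, -0.5, 0.8, 0.1]           # outputs O_p of layer P
w = [0.2, 0.7, -0.4, 0.9]           # weights w_{qp} into a single node q

def I_q(weights):
    # I_q = sum over p of w_{qp} * O_p  (equation 9.3-48 style)
    return sum(w_p * O_p for w_p, O_p in zip(weights, O))

p = 2                               # pick one weight index to perturb
eps = 1e-6
w_plus = list(w)
w_plus[p] += eps

# finite-difference estimate of dI_q / dw_{qp}
num_grad = (I_q(w_plus) - I_q(w)) / eps

print(round(num_grad, 4), O[p])     # both come out as 0.8, i.e. O_p
```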
Also by equation 9.3-58:
$E_Q = \frac{1}{2} \cdot \sum_{q=1}^{N_Q} (r_q - O_q)^2$
$\frac{\partial E_Q}{\partial O_q} = -(r_q - O_q)$
In my mind the result should be $\sum_{q=1}^{N_Q} -(r_q - O_q)$.
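Again I did a quick numerical check myself (my own sketch, the numbers are invented). Perturbing a single output $O_q$ gives $-(r_q - O_q)$ with no sum, which matches the book but not my intuition:

```python
r = [1.0, 0.0, 0.5]                 # target values r_q (made up)
O = [0.8, 0.2, 0.4]                 # output activations O_q (made up)

def E_Q(outputs):
    # E_Q = 1/2 * sum over q of (r_q - O_q)^2  (equation 9.3-58)
    return 0.5 * sum((r_q - O_q) ** 2 for r_q, O_q in zip(r, outputs))

q = 0                               # pick one output node to perturb
eps = 1e-6
O_plus = list(O)
O_plus[q] += eps

# finite-difference estimate of dE_Q / dO_q
num_grad = (E_Q(O_plus) - E_Q(O)) / eps

print(round(num_grad, 4), round(-(r[q] - O[q]), 4))   # both -0.2
```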
I would really appreciate it if someone could explain why the summation drops out.
I found these equations in Gonzalez/Woods, Section 9.3.3, pages 595–619 (Neural Networks).
