How do I solve $\frac{\partial}{\partial w_{jk}} \sum_j w_{jk} \cdot o_j$


Please, I want to know why $\frac{\partial}{\partial W_{jk}} \sum_j W_{jk} \cdot o_j = o_j$.

I know $\frac{\partial}{\partial W_{jk}} ( W_{jk} \cdot o_j ) = o_j$, but I got confused by the summation symbol.

I was reading it in "Make Your Own Neural Network" (page 96) by Tariq Rashid. $W_{jk}$ represents the weight connecting node $j$ to node $k$, and $o_j$ represents the output of node $j$.

But then what I was thinking was: if the partial derivative of $W_{jk}$ with respect to $W_{jk}$ is one, then it means we are getting $\sum_j 1 \cdot o_j$, and $\sum 1$ is 1. So finally we will get $o_j$.


BEST ANSWER

Please, I want to know why $\frac{\partial}{\partial W_{jk}} \sum_j W_{jk} \cdot o_j = o_j$

...but I got confused by the summation symbol.

That is not unusual. You have overloaded the $j$ token: one use is a free index, and the other is bound to the summation. Although this is legal, it is often a source of confusion. For clarity you should use a fresh token for the summation index. Recall that $\sum_j a_j = \sum_i a_i$.

Then we may partition the series, extracting the one term where the series index equals the free token from all the others, where they do not match.

$$\dfrac{\partial\quad}{\partial W_{jk}} { \sum_i W_{ik} \cdot o_i } ~=~ \dfrac{\partial\quad}{\partial W_{jk}} ( W_{jk} \cdot o_j )+\sum_{i~:~i\neq j}\dfrac{\partial\quad}{\partial W_{jk}}(W_{ik}\cdot o_i)$$

You can handle the first term directly, while the remaining terms vanish because $W_{ik}$ with $i \neq j$ does not depend on $W_{jk}$.
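The partition above can be checked numerically with a finite-difference sketch. The matrix `W`, vector `o`, and indices below are made-up illustrative values, not from the book:

```python
import numpy as np

# Hypothetical small example: W[i, k] are weights, o[i] are node outputs.
rng = np.random.default_rng(0)
n = 4
W = rng.standard_normal((n, n))
o = rng.standard_normal(n)
j, k = 1, 2

def s(w_jk):
    """Compute sum_i W_{ik} * o_i with the (j, k) entry set to w_jk."""
    Wc = W.copy()
    Wc[j, k] = w_jk
    return Wc[:, k] @ o

# Central finite difference; the sum is linear in W_{jk},
# so this matches the exact derivative up to rounding error.
eps = 1e-6
numeric = (s(W[j, k] + eps) - s(W[j, k] - eps)) / (2 * eps)
print(abs(numeric - o[j]) < 1e-6)  # the derivative equals o_j
```

Only the single term containing $W_{jk}$ contributes, which is why the result is $o_j$ and not a full sum.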

SECOND ANSWER

Maybe you should rewrite the term you want to differentiate as $$ \phi = \sum_l W_{lk}o_l = W_{jk}o_j + C $$ where $C$ collects all the terms that do not depend on $W_{jk}$.

Now the result is clear that $$ \frac{\partial \phi}{\partial W_{jk}} = o_j $$
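As a concrete instance of this splitting (a hypothetical three-node case with $j = 2$, not taken from the book):

```latex
% Three-node example: differentiate with respect to W_{2k}.
\phi = W_{1k}o_1 + W_{2k}o_2 + W_{3k}o_3
     = W_{2k}o_2 + \underbrace{\left( W_{1k}o_1 + W_{3k}o_3 \right)}_{C}
\quad\Longrightarrow\quad
\frac{\partial \phi}{\partial W_{2k}} = o_2 .
```

Since $C$ is constant with respect to $W_{2k}$, its derivative is zero and only $o_2$ survives.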