Cost Function Neural Network With Weight Derivation


Given this cost function of a single-layered neural network with a sigmoid activation function:

$$E_j = \frac{1}{2} \sum_{k=1}^{K}(\text{target}_{jk} - \text{observed}_{jk})^2 + a\sum_{i=1}^{I}w_{ij}^2$$

I have figured out how to differentiate the term on the left side of the $+$, but I need some help with the term on the right side. Here $a$ is a regularization factor and $w_{ij}$ is the weight of the NN from unit $i$ to unit $j$.
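
In case it helps to write out the step I am stuck on: the regularization term involves $w_{ij}$ only quadratically, so its partial derivative should follow directly (sketch, assuming the sum runs over the incoming weights of unit $j$ and that all other weights are treated as constants):

$$\frac{\partial}{\partial w_{ij}}\left(a\sum_{i'=1}^{I} w_{i'j}^2\right) = 2a\,w_{ij}$$

so the full gradient would be the derivative of the squared-error term plus $2a\,w_{ij}$.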

Thank you in advance.
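
As a sanity check on whatever closed form is derived, the derivative of the regularization term can be compared against a central finite difference. This is only a sketch; the values of `a` and the weight vector `w` below are made-up example numbers, not anything from the question.

```python
a = 0.01                      # regularization factor (hypothetical value)
w = [0.5, -1.2, 0.3]          # incoming weights w_ij of unit j (hypothetical)

def reg_term(weights):
    """Regularization term a * sum_i w_ij^2 from the cost function."""
    return a * sum(wi * wi for wi in weights)

def numeric_grad(weights, i, h=1e-6):
    """Central finite difference of reg_term w.r.t. weights[i]."""
    wp = list(weights); wp[i] += h
    wm = list(weights); wm[i] -= h
    return (reg_term(wp) - reg_term(wm)) / (2 * h)

# The candidate closed form 2 * a * w_ij should match the numeric estimate.
for i, wi in enumerate(w):
    analytic = 2 * a * wi
    assert abs(analytic - numeric_grad(w, i)) < 1e-8
```

The same trick works for the squared-error term: perturb one weight, recompute the cost, and compare the slope against the analytic gradient.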