How to show that a gradient is a sum of gradients?

I am learning about neural networks and especially back propagation.

How can one show that, when a weight matrix appears more than once in a neural network, the gradient of the loss function with respect to that weight matrix is the sum of the gradients from each occurrence? For an example, please refer to example 3.
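To make the claim concrete, here is a minimal numerical sketch (my own toy setup, not from the referenced example): the same matrix `W` is applied twice, as `y = W @ (W @ x)` with loss `L = 0.5 * ||y||^2`. Differentiating each occurrence of `W` separately and summing the two per-use gradients reproduces the finite-difference estimate of `dL/dW`.

```python
import numpy as np

# Hypothetical toy network (assumed for illustration): the shared matrix W
# is used twice, y = W @ (W @ x), with loss L = 0.5 * ||y||^2.
rng = np.random.default_rng(0)
n = 4
W = rng.normal(size=(n, n))
x = rng.normal(size=(n,))

def loss(W_):
    y = W_ @ (W_ @ x)
    return 0.5 * float(y @ y)

# Per-use gradients: write L(U, V) = 0.5 * ||U @ V @ x||^2, differentiate with
# respect to each occurrence separately, then evaluate at U = V = W.
h = W @ x                               # output of the first (inner) use
y = W @ h                               # output of the second (outer) use
grad_outer = np.outer(y, h)             # dL/dU  (gradient from the outer use)
grad_inner = np.outer(W.T @ y, x)       # dL/dV  (gradient from the inner use)
grad_total = grad_outer + grad_inner    # claimed gradient w.r.t. the shared W

# Check against a central finite-difference estimate of dL/dW.
eps = 1e-6
fd = np.zeros_like(W)
for i in range(n):
    for j in range(n):
        E = np.zeros_like(W)
        E[i, j] = eps
        fd[i, j] = (loss(W + E) - loss(W - E)) / (2 * eps)

# Maximum discrepancy is tiny, so the sum of per-use gradients is the gradient.
print(np.max(np.abs(grad_total - fd)))
```

The underlying reason is the multivariable chain rule: treating each occurrence of the shared matrix as an independent variable and then constraining them to be equal adds up the partial derivatives from every occurrence.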