I was trying to understand how gradients are calculated during backpropagation in the context of neural networks, from here.
It says the following:
The forward expression involves the variables $x,y$ multiple times, so when we perform backpropagation we must be careful to use $+=$ instead of $=$ to accumulate the gradient on these variables (otherwise we would overwrite it). This follows the multivariable chain rule in Calculus, which states that if a variable branches out to different parts of the circuit, then the gradients that flow back to it will add.
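To make the quote concrete for myself, here is a small sketch of what I think the `+=` accumulation means, using the forward expression $f = (x \cdot y)(x + y)$ (my own example, not from the linked page), where $x$ and $y$ each feed into two nodes:

```python
# Forward pass: f = (x * y) * (x + y); x and y each branch into two nodes.
x, y = 3.0, -2.0
a = x * y        # first branch
b = x + y        # second branch
f = a * b

# Backward pass: because x and y branch, their gradients accumulate with +=.
df = 1.0
da = b * df      # local gradient of f w.r.t. a
db = a * df      # local gradient of f w.r.t. b
dx = 0.0
dy = 0.0
dx += y * da     # contribution flowing back through a = x * y
dy += x * da
dx += 1.0 * db   # contribution flowing back through b = x + y
dy += 1.0 * db

# Sanity check against the analytic gradient of f = x^2*y + x*y^2:
# df/dx = 2xy + y^2, df/dy = x^2 + 2xy
assert abs(dx - (2 * x * y + y * y)) < 1e-9
assert abs(dy - (x * x + 2 * x * y)) < 1e-9
```

If I used `=` instead of `+=`, the second branch's gradient would overwrite the first, which is (I believe) exactly the mistake the quote warns about.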
I know the chain rule says:
$$\frac{df_1}{dx}=\frac{df_1}{df_2}\frac{df_2}{df_3}\cdots\frac{df_n}{dx}$$
But I am not able to picture the quote as an equation. Can someone put the quote in equation form? Does the above Wikipedia page already state this equation, and my eyes are simply failing to locate / relate it?