Partial derivative in backpropagation


I'm studying the backpropagation algorithm, and in particular its calculus, from 3Blue1Brown's videos on YouTube. I understand the overall process, but there is one partial derivative that I cannot understand: $$ \frac{∂z^{(L)}}{∂w^{(L)}} = a^{(L-1)} $$ Given: $$ z^{(L)}=w^{(L)}a^{(L-1)}+b^{(L)} $$

Why is the result simply $a^{(L-1)}$?

Thank you for your time!

Because the weights and biases are independent parameters, neither $a^{(L-1)}$ nor $b^{(L)}$ depends on $w^{(L)}$. So in $z^{(L)} = w^{(L)}a^{(L-1)} + b^{(L)}$, only the first term contributes to the derivative: $a^{(L-1)}$ acts as a constant coefficient of $w^{(L)}$, the $b^{(L)}$ term differentiates to zero, and what remains is $\frac{∂z^{(L)}}{∂w^{(L)}} = a^{(L-1)}$.
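As a sanity check (not part of the original answer), you can verify this numerically with a central finite difference: for $z = w\,a_{\text{prev}} + b$, nudging $w$ by $\pm\varepsilon$ while holding $a_{\text{prev}}$ and $b$ fixed recovers $a_{\text{prev}}$ as the slope. The variable names here are illustrative.

```python
# Finite-difference check that dz/dw = a_prev for z = w * a_prev + b.
def z(w, a_prev, b):
    return w * a_prev + b

w, a_prev, b = 0.7, 1.3, -0.2
eps = 1e-6

# Central difference approximation of dz/dw at the point (w, a_prev, b).
numeric = (z(w + eps, a_prev, b) - z(w - eps, a_prev, b)) / (2 * eps)
analytic = a_prev  # the claimed derivative, a^{(L-1)}

print(abs(numeric - analytic) < 1e-8)  # → True
```

Because $z$ is linear in $w$, the finite difference is exact up to floating-point error, which is why the tolerance can be so tight.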