I am following a lecture on logistic regression using gradient descent, and I have an issue understanding a shortcut for a derivative.
Let:
- $z=w_1x_1+w_2x_2+b$
- $a=\sigma(z)$
- and the loss function $\mathcal L(a,y)=-(y\log(a)+(1-y)\log(1-a))$, which I know has a name but I can't remember it.
In order to get $\frac{\partial\mathcal L(a,y)}{\partial z}$ I am able to compute $\frac{\partial\mathcal L(a,y)}{\partial a}$. But where does $\frac{\partial a}{\partial z}=a(1-a)$ come from?
I can only get as far as $\frac{\partial a}{\partial z}=\frac{\partial\sigma(w_1x_1+w_2x_2+b)}{\partial z}$.

$$\sigma(z)=\frac{1}{1+\exp(-z)}$$
Differentiating with the quotient (or chain) rule and then factoring:
\begin{align}
\frac{\partial a}{\partial z}
&=\frac{\exp(-z)}{(1+\exp(-z))^2}\\
&=\frac{1}{1+\exp(-z)}\cdot\frac{\exp(-z)}{1+\exp(-z)}\\
&=\frac{1}{1+\exp(-z)}\left(1-\frac{1}{1+\exp(-z)}\right)\\
&=\sigma(z)(1-\sigma(z))\\
&=a(1-a)
\end{align}
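If you want to convince yourself numerically, here is a small sketch in Python that compares the analytic form $a(1-a)$ against a central finite-difference estimate of $\frac{\partial\sigma}{\partial z}$ (the test point $z=0.3$ and step $h$ are arbitrary choices):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

z = 0.3
a = sigmoid(z)

# central finite-difference approximation of d(sigma)/dz
h = 1e-6
numeric = (sigmoid(z + h) - sigmoid(z - h)) / (2 * h)

# analytic derivative derived above
analytic = a * (1 - a)

print(abs(numeric - analytic))  # tiny: the two agree to ~1e-10 or better
```

Once you have $\frac{\partial a}{\partial z}=a(1-a)$, the chain rule combines it with $\frac{\partial\mathcal L}{\partial a}=-\frac{y}{a}+\frac{1-y}{1-a}$ to give the well-known shortcut $\frac{\partial\mathcal L}{\partial z}=a-y$.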