In the PDF of Andrew Ng's supervised learning notes, p. 23, it is stated that if $$\eta = \log\left(\frac{\phi}{1-\phi}\right)$$ then $$ \phi = \frac{1}{1+e^{-\eta}}. $$ How can we prove this step by step?
I tried working backward but got stuck: \begin{align} \frac{1}{1+e^{-\eta}}&=\phi\\ \implies 1&=\phi(1+e^{-\eta})\\ &= \phi + \frac{\phi}{e^{\eta}}\\ &= \frac{e^{\eta}\phi + \phi}{e^{\eta}}\\ &= \phi\frac{e^{\eta} + 1}{e^{\eta}} \end{align}
Edit (completed with the help of the posted answers): \begin{align} 1&= \phi\frac{e^{\eta} + 1}{e^{\eta}}\\ \implies \frac{1}{\phi} &= \frac{e^{\eta} + 1}{e^{\eta}}\\ \implies \phi &= \frac{e^{\eta}}{e^{\eta} + 1}\\ &= \frac{e^{\eta}}{e^{\eta} + 1}\cdot\frac{\frac{1}{e^{\eta}}}{\frac{1}{e^{\eta}}}\\ &=\frac{1}{1+ e^{-\eta}} \end{align}
$$\eta=\log\frac{\phi}{1-\phi}$$
$$\exp\eta=\frac{\phi}{1-\phi}$$
$$(1-\phi)\exp\eta=\phi$$
$$\exp\eta-\phi\exp\eta=\phi$$
$$\exp\eta=\phi+\phi\exp\eta=\phi(1+\exp\eta)$$
$$\phi=\frac{\exp\eta}{1+\exp\eta}$$ $$\phi=\frac{1}{1+\exp(-\eta)}$$
Side note: in statistics, the function $\phi\mapsto\log\frac{\phi}{1-\phi}$ is called the logit function, and $\eta\mapsto\dfrac{1}{1+\exp(-\eta)}$ is called the logistic (or sigmoid) function. They are inverses of each other, and are used in logistic regression, for instance.
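As a quick numerical sanity check of the derivation above (a sketch, not part of the original question), one can verify in Python that the logistic function undoes the logit on $(0,1)$; the function names `logit` and `logistic` are my own:

```python
import math

def logit(phi):
    # logit: phi -> log(phi / (1 - phi)), defined for phi in (0, 1)
    return math.log(phi / (1 - phi))

def logistic(eta):
    # logistic: eta -> 1 / (1 + exp(-eta))
    return 1 / (1 + math.exp(-eta))

# The two functions should be inverses of each other on (0, 1):
for phi in [0.01, 0.25, 0.5, 0.75, 0.99]:
    assert math.isclose(logistic(logit(phi)), phi)
```

This confirms, up to floating-point precision, that $\phi = \frac{1}{1+e^{-\eta}}$ whenever $\eta = \log\frac{\phi}{1-\phi}$.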