Consider the model $logit(p)=a+bx$. I would like an analytic formula for $a$ and $b$, like in linear regression, where we can write down closed-form estimates of $a$ and $b$.
I tried using MLE to estimate them, but it is too complicated for me.
Unlike linear regression, logistic regression has no closed-form solution, so the estimates of $a$ and $b$ are obtained iteratively, for example with gradient descent. To estimate the values of $a$ and $b$ in your model:
$$logit(p)=a+bx^{(i)}$$
To simplify, consider that $a$ multiplies a constant feature $x^{(0)}$ with value $1$, and use matrix notation:
$$ Z = \theta^T x$$
In logistic regression, the hypothesis is the sigmoid function (note the negative sign in the exponent, so that $h_\theta(x)\to 1$ as $Z\to\infty$):
$$ h_\theta (x) = \dfrac{1}{1 + e^{-\theta^T x}} $$
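As a quick sanity check, the sigmoid can be written as a minimal NumPy sketch (the function name `sigmoid` is mine):

```python
import numpy as np

def sigmoid(z):
    # h_theta(x) = 1 / (1 + exp(-z)); maps any real z into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))  # 0.5, the midpoint of the curve
```

The negative exponent is what makes the function increasing: large positive $Z$ gives probabilities near $1$, large negative $Z$ gives probabilities near $0$.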
Now we need to define a cost function $J(\theta)$; here we use the MSE (mean squared error), although the cross-entropy loss is the more common choice for logistic regression:
$$J(\theta) = \dfrac {1}{2m} \Big[\displaystyle (h_\theta (x) - y)^T (h_\theta (x) - y) \Big]$$
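The cost above can be evaluated directly; a sketch (the name `cost` is mine, and `X` is assumed to carry the constant $x^{(0)}=1$ column):

```python
import numpy as np

def cost(theta, X, y):
    # J(theta) = 1/(2m) * (h - y)^T (h - y)
    m = X.shape[0]
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))  # h_theta(x)
    r = h - y
    return float(r @ r) / (2 * m)

X = np.array([[1.0, 0.0], [1.0, 1.0]])  # first column is x^(0) = 1
y = np.array([0.0, 1.0])
print(cost(np.zeros(2), X, y))  # theta = 0 gives h = 0.5 everywhere -> J = 0.125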
The update of the values of $\theta$ using gradient descent is defined by:
$$ \theta = \theta - \gamma \dfrac{dJ(\theta)}{d\theta} $$
To calculate the gradient $\dfrac{dJ(\theta)}{d\theta}$ we use the chain rule:
$$ \dfrac{dJ(\theta)}{d\theta} = \dfrac{dJ(\theta)}{dh_\theta (x)}\dfrac{dh_\theta (x)}{dZ}\dfrac{dZ}{d\theta} $$
The derivative of the MSE, $\dfrac{dJ(\theta)}{dh_\theta (x)}$, is:
$$\dfrac{dJ(\theta)}{dh_\theta (x)} = \dfrac{1}{m}(h_\theta (x) - y)$$
Considering that the derivative of the Sigmoid Function is:
$$ \dfrac{dh_\theta (x)}{dZ} = h_\theta (x)\odot(1-h_\theta (x)) $$
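This identity can be verified by differentiating the sigmoid directly:
$$ \dfrac{d}{dZ}\,\dfrac{1}{1+e^{-Z}} = \dfrac{e^{-Z}}{(1+e^{-Z})^2} = \dfrac{1}{1+e^{-Z}}\cdot\dfrac{e^{-Z}}{1+e^{-Z}} = h_\theta (x)\,(1-h_\theta (x)) $$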
And since $\dfrac{dZ}{d\theta} = x$, the final update rule is:
$$ \theta = \theta - \dfrac{\gamma}{m}\, x^T \Big[(h_\theta (x) - y) \odot h_\theta (x)\odot(1-h_\theta (x))\Big] $$
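Putting the pieces together, here is a minimal NumPy sketch of the whole procedure. The function name `fit_logistic`, the learning rate $\gamma=0.5$, the iteration count, and the toy data are my own illustrative choices, not part of the derivation above:

```python
import numpy as np

def fit_logistic(X, y, gamma=0.5, iters=20000):
    # X: (m, n) design matrix whose first column is x^(0) = 1, so theta[0] = a
    m = X.shape[0]
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        h = 1.0 / (1.0 + np.exp(-(X @ theta)))    # h_theta(x) = sigmoid(Z)
        grad = X.T @ ((h - y) * h * (1 - h)) / m  # chain-rule gradient of the MSE
        theta = theta - gamma * grad              # gradient-descent update
    return theta

# Toy data: labels switch from 0 to 1 around x = 2
x = np.array([0.0, 1.0, 3.0, 4.0])
X = np.column_stack([np.ones_like(x), x])
y = np.array([0.0, 0.0, 1.0, 1.0])
a, b = fit_logistic(X, y)  # b > 0: probability increases with x
```

Note that with the MSE cost this objective is non-convex, which is one reason the cross-entropy loss is usually preferred in practice; for this simple example, gradient descent from $\theta = 0$ converges fine.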