Suppose $X$ and $Y$ are two random variables. In the context of regression, we assume that the mean of $Y$ is a function of $X$, and $Y$ varies around its mean according to some noise distribution which is independent of $X$. That is,
$$Y = E(Y \mid X) + \varepsilon$$
where $\varepsilon$ and $X$ are independent.
I am trying to understand what the terms $E(Y \mid X)$ and $\varepsilon$ are when we model $Y$ as Bernoulli using the standard GLM / logistic regression formulation. If $Y$ is Bernoulli with parameter $\phi$, we can write the distribution $p(y; \phi)$ in exponential-family form as:
$$p(y; \phi) = \phi^y (1 - \phi)^{1 - y} = \exp\left\{y \log(\frac{\phi}{1 - \phi}) + \log(1 - \phi)\right\}.$$
Here, $E(Y) = \phi$, the link function is $g(\mu) = \log\left(\frac{\mu}{1 - \mu}\right)$ with $\mu = E(Y \mid X)$, and we assume that the natural parameter $\eta = \log\left(\frac{\mu}{1 - \mu}\right) = \beta x$ is linear in $x$.
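To check my reading of the link function, here is a small numerical sanity check that the inverse of the logit link is the sigmoid (the function names are mine):

```python
import math

def logit(mu):
    # link function g(mu) = log(mu / (1 - mu))
    return math.log(mu / (1 - mu))

def sigmoid(eta):
    # inverse link g^{-1}(eta) = 1 / (1 + exp(-eta))
    return 1 / (1 + math.exp(-eta))

# Round-tripping through g and g^{-1} recovers the mean.
mu = 0.3
assert abs(sigmoid(logit(mu)) - mu) < 1e-12
```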
I think this means that $$Y = \frac{1}{1 + \exp(-\beta X)} + \varepsilon$$ but I don't know what the distribution of $\varepsilon$ is.
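To get a feel for what $\varepsilon$ looks like, I simulated a few draws; $\beta$ and the distribution of $X$ below are arbitrary choices of mine. Writing $p = \sigma(\beta x)$ for the conditional mean, the residual $y - p$ can only take the value $1 - p$ (when $Y = 1$) or $-p$ (when $Y = 0$), so its support appears to depend on $x$, which is part of what confuses me:

```python
import math
import random

random.seed(0)
beta = 1.5  # arbitrary coefficient for illustration

def sigmoid(eta):
    return 1 / (1 + math.exp(-eta))

# Draw (X, Y) pairs and record the residual eps = Y - E(Y | X).
residuals = []
for _ in range(5):
    x = random.uniform(-2, 2)
    p = sigmoid(beta * x)           # E(Y | X = x)
    y = 1 if random.random() < p else 0
    residuals.append((x, y - p))    # eps is either 1 - p or -p

print(residuals)
```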