$P(w_0 | x) = \frac{1}{1 + e^{-\log\frac{P(x|w_0)}{P(x|w_1)}-\log\frac{P(w_0)}{P(w_1)}}}$ Note: $x = [x_1, \dots, x_d]^T$ is a $d$-dimensional vector, and $w$ can take on one of two values: $w_0$ or $w_1$.
$P(x|w_i) = \frac{1}{Z}e^{-\frac{1}{2}(x-\mu_i)^T\Sigma^{-1}(x - \mu_i)}$.
$Z$ is the normalization constant. It's not going to matter because it divides out: both class-conditionals share the same covariance matrix $\Sigma$, so their constants are identical.
What I want to show is $$P(w_0|x) = \frac{1}{1+ e^{-(W^Tx + b)}},$$
where $$W = \Sigma^{-1}(\mu_0-\mu_1),$$ and $$b = \frac{1}{2}(\mu_1+\mu_0)^T\Sigma^{-1}(\mu_1 - \mu_0) + \log\frac{P(w_0)}{P(w_1)}.$$
(I'm trying to read this paper: http://www.ics.uci.edu/~smyth/courses/cs274/readings/jordan_logistic.pdf ... the relevant equations are (2) - (7).)
This is probably just some simple algebra I'm missing, but I haven't been able to come up with the derivation.
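For what it's worth, I did convince myself numerically that the claimed identity holds, so it really should be just algebra. Here's the quick sanity check I ran (a sketch using NumPy; the `gauss` helper and the random-parameter setup are just for this check, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3

# Random symmetric positive-definite covariance, shared by both classes.
A = rng.standard_normal((d, d))
Sigma = A @ A.T + d * np.eye(d)
Sigma_inv = np.linalg.inv(Sigma)

mu0, mu1 = rng.standard_normal(d), rng.standard_normal(d)
p0 = 0.3  # prior P(w_0); P(w_1) = 1 - p0
x = rng.standard_normal(d)

def gauss(x, mu):
    # Unnormalized N(mu, Sigma) density; the constant Z cancels
    # in the posterior, so it is omitted here.
    diff = x - mu
    return np.exp(-0.5 * diff @ Sigma_inv @ diff)

# Posterior P(w_0 | x) computed directly from Bayes' rule.
num = gauss(x, mu0) * p0
post = num / (num + gauss(x, mu1) * (1 - p0))

# The claimed closed form: sigmoid(W^T x + b).
W = Sigma_inv @ (mu0 - mu1)
b = 0.5 * (mu1 + mu0) @ Sigma_inv @ (mu1 - mu0) + np.log(p0 / (1 - p0))
sigmoid = 1.0 / (1.0 + np.exp(-(W @ x + b)))

print(post, sigmoid)  # the two agree to machine precision
```

So the target expressions for $W$ and $b$ are definitely right; I just can't see the algebra that gets there.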