Consider the logistic regression model, where the input data is distributed as $X\sim N(0,\Sigma)$ and the labels $Y\in \{-1,1\}$ have the following conditional distribution: $$P(Y=1\mid X,\theta^*)=\frac{1}{1+\exp(-X^T\theta^*)}$$ where $\theta^*$ is a fixed vector that we want to estimate. Suppose that we have estimated $\theta^*$ by $\hat{\theta}$.
I want to find a lower bound on the generalization error in terms of $\|\theta^*-\hat{\theta}\|_{\ell_2}$. More precisely, the generalization error is:
\begin{align} P(\hat{Y}\neq Y)=&P(Y=-1,\hat{Y}=1)+P(Y=1,\hat{Y}=-1)\\ =&P\!\left(\frac{1}{1+\exp(-X^T\theta^*)}<\frac{1}{2},\ \frac{1}{1+\exp(-X^T\hat{\theta})}>\frac{1}{2}\right)\\ &+P\!\left(\frac{1}{1+\exp(-X^T\theta^*)}>\frac{1}{2},\ \frac{1}{1+\exp(-X^T\hat{\theta})}<\frac{1}{2}\right)\\ =&P(X^T\theta^*<0,\,X^T\hat{\theta}>0)+P(X^T\theta^*>0,\,X^T\hat{\theta}<0) \end{align}
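The last equality uses only the fact that $\frac{1}{1+\exp(-t)}>\frac{1}{2}$ iff $t>0$, so the event that the two classifiers disagree is exactly the event that the two linear scores have opposite signs. Here is a minimal numerical sanity check of that thresholding step; the dimension, the parameters, and the perturbation size are illustrative choices, not from the question:

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 3, 100_000

# Illustrative parameters (not from the question)
theta_star = rng.standard_normal(d)
theta_hat = theta_star + 0.5 * rng.standard_normal(d)
X = rng.standard_normal((n, d))  # take Sigma = I just for this check

sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))
p_star, p_hat = sigmoid(X @ theta_star), sigmoid(X @ theta_hat)

# The disagreement event, written two ways
via_probs = (p_star > 0.5) != (p_hat > 0.5)   # logistic probabilities on opposite sides of 1/2
via_signs = (X @ theta_star > 0) != (X @ theta_hat > 0)  # linear scores with opposite signs

print(np.array_equal(via_probs, via_signs))
```

With continuous data the scores are nonzero almost surely, so the two event indicators coincide sample by sample.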
But we know that, since $X\sim N(0,\Sigma)$, the pair $(w_1,w_2)=(X^T\theta^*,X^T\hat{\theta})$ is jointly Gaussian with mean zero and covariance determined by $\Sigma$. I was wondering whether this fact can be used to simplify the last expression and derive the desired lower bound.
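One standard route: for a centered jointly Gaussian pair $(w_1,w_2)$ with correlation $\rho$, the orthant identity gives $P(w_1>0,w_2<0)+P(w_1<0,w_2>0)=\frac{\arccos(\rho)}{\pi}$, where here $\rho=\frac{\theta^{*T}\Sigma\hat{\theta}}{\sqrt{\theta^{*T}\Sigma\theta^*}\sqrt{\hat{\theta}^T\Sigma\hat{\theta}}}$. A sketch comparing this closed form against a Monte Carlo estimate; the covariance and the estimate $\hat{\theta}$ below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5

# Illustrative choices (not from the question): a random PD covariance and a perturbed estimate
A = rng.standard_normal((d, d))
Sigma = A @ A.T + np.eye(d)                            # positive definite covariance
theta_star = rng.standard_normal(d)
theta_hat = theta_star + 0.3 * rng.standard_normal(d)  # a nearby estimate

# Correlation of (w1, w2) = (X^T theta*, X^T theta_hat) when X ~ N(0, Sigma)
rho = (theta_star @ Sigma @ theta_hat) / np.sqrt(
    (theta_star @ Sigma @ theta_star) * (theta_hat @ Sigma @ theta_hat)
)
closed_form = np.arccos(rho) / np.pi  # orthant identity: P(sign(w1) != sign(w2))

# Monte Carlo estimate of the same disagreement probability
n = 200_000
X = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
mc = np.mean(np.sign(X @ theta_star) != np.sign(X @ theta_hat))

print(closed_form, mc)
```

The $\arccos$ form is convenient for lower bounds: it depends on $\theta^*-\hat{\theta}$ only through the angle between $\Sigma^{1/2}\theta^*$ and $\Sigma^{1/2}\hat{\theta}$, which can then be bounded below in terms of $\|\theta^*-\hat{\theta}\|_{\ell_2}$ using the spectrum of $\Sigma$.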