I am trying to prove the following theorem:
There exists a universal constant $K>0$ such that if $g,h$ are standard Gaussians with $\mathsf{E} gh=1-\alpha$,
$$\mathsf{P}\left(g< -1,h> 1\right)\ge K\alpha.$$
Note that without loss of generality, we can assume that $g=\langle u,X\rangle$ and $h=\langle v,X\rangle$, where $u$ and $v$ are unit vectors in $\mathbb{R}^2$ such that $\langle u,v\rangle=1-\alpha$ and $X$ is a standard Gaussian in $\mathbb{R}^2$. So it is basically a geometric statement about the Gaussian measure of a certain region of the plane.
If we replace $-1$ and $1$ by $0$, then I can prove that
$$\mathsf{P}\left(g<0,h>0\right)=\frac{\arccos(1-\alpha)}{2\pi}$$
using rotational invariance of Gaussians. Is it possible to extend this reasoning to prove the above theorem?
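For what it's worth, the $\epsilon=0$ formula checks out numerically. Here is a quick Monte Carlo sketch using `numpy`, with $\alpha = 1/2$ chosen arbitrarily (so $\arccos(1-\alpha)/(2\pi) = 1/6$):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.5
rho = 1 - alpha

# Sample (g, h) as projections of a 2-D standard Gaussian onto
# unit vectors u, v with <u, v> = rho, as described above.
n = 200_000
X = rng.standard_normal((n, 2))
u = np.array([1.0, 0.0])
v = np.array([rho, np.sqrt(1 - rho**2)])
g, h = X @ u, X @ v

empirical = np.mean((g < 0) & (h > 0))
exact = np.arccos(rho) / (2 * np.pi)
print(empirical, exact)  # both should be close to 1/6
```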
There is no positive $K$ that will make the inequality true for all $\alpha \in (0,2)$.
This proof works more generally to show that for any positive $\epsilon$, there is no positive $K$ for which the inequality $$P[X<-\epsilon,Y>\epsilon]\ge K \alpha$$ holds for all $\alpha \in (0,2)$.
Divide both sides by $\alpha$ to rewrite the inequality as
$$\frac{P[X<-\epsilon,Y>\epsilon]}{\alpha} \ge K $$
The probability in the numerator can be written as
$$\int_{-\infty}^{-\epsilon} \int_{\epsilon}^{\infty}f_{X,Y,\alpha}(x,y)\,dy\,dx,$$ where $f_{X,Y,\alpha}(x,y)$ is the bivariate normal density of standard normal variables with correlation $1-\alpha$.
First, show that this probability is increasing as a function of $\alpha$ over the interval $(0,2)$.
Differentiate under the integral sign with respect to $\alpha$. By Plackett's identity, $\partial f_{X,Y,\alpha}/\partial \rho = \partial^2 f_{X,Y,\alpha}/\partial x\,\partial y$ where $\rho = 1-\alpha$, so the double integral collapses (up to sign) to the value of the density at the corner point $(-\epsilon,\epsilon)$.
The derivative is
$$\frac{d}{d\alpha} \left(\int_{-\infty}^{-\epsilon} \int_{\epsilon}^{\infty}f_{X,Y,\alpha}(x,y)\,dy\,dx \right) = \int_{-\infty}^{-\epsilon} \int_{\epsilon}^{\infty} \frac{\partial}{\partial\alpha}f_{X,Y,\alpha}(x,y)\,dy\,dx =\frac{e^{-\epsilon^2/\alpha}}{2 \pi \sqrt{\alpha(2-\alpha)}}.$$
This is always positive.
Hence, the probability is strictly increasing as a function of $\alpha$. The infimum is the limit as $\alpha$ approaches $0$: in the limit, $X=Y$, and since $X$ can't be both less than $-\epsilon$ and greater than $\epsilon$ simultaneously, the infimum is $0$. The supremum is the limit as $\alpha$ approaches $2$: in the limit, $X=-Y$, and that probability is $P[X<-\epsilon]$.
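The closed form for the derivative can be checked numerically. The sketch below (assuming `scipy` is available) computes the probability by conditioning on $X = x$, so that $Y \mid X = x \sim N(\rho x,\, 1-\rho^2)$ with $\rho = 1-\alpha$, and compares a central finite difference in $\alpha$ against the formula above; the values $\epsilon = 1$, $\alpha = 0.8$ are arbitrary:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def prob(eps, alpha):
    """P[X < -eps, Y > eps] for standard normals with correlation rho = 1 - alpha.

    Conditioning on X = x gives Y | X = x ~ N(rho * x, 1 - rho^2).
    """
    rho = 1 - alpha
    s = np.sqrt(1 - rho**2)
    integrand = lambda x: norm.pdf(x) * norm.sf((eps - rho * x) / s)
    return quad(integrand, -np.inf, -eps, epsabs=1e-12, epsrel=1e-12)[0]

eps, alpha, h = 1.0, 0.8, 1e-4
numeric = (prob(eps, alpha + h) - prob(eps, alpha - h)) / (2 * h)
closed = np.exp(-eps**2 / alpha) / (2 * np.pi * np.sqrt(alpha * (2 - alpha)))
print(numeric, closed)  # the two values should agree to several decimal places
```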
Now, return to the ratio of this probability to $\alpha$, and find the limit as $\alpha$ goes to $0$. Both numerator and denominator tend to $0$, so by L'Hospital's rule the limit is the same as the limit of the derivative of the numerator (the derivative of the denominator is $1$):
$$\lim_{\alpha \to 0^+} \frac{P[X<-\epsilon,Y>\epsilon]}{\alpha}=\lim_{\alpha \to 0^+} \frac{e^{-\epsilon^2/\alpha}}{2 \pi \sqrt{\alpha(2-\alpha)}}=0.$$
The reason a positive $K$ does exist in the case $\epsilon=0$ is that there $$\lim_{\alpha \to 0^+} \frac{P[X<0,Y>0]}{\alpha}=\lim_{\alpha \to 0^+} \frac{\arccos(1-\alpha)}{2\pi\alpha}=\infty,$$ since $\arccos(1-\alpha) \sim \sqrt{2\alpha}$ as $\alpha \to 0^+$.
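The contrast between the two cases can also be seen numerically. This sketch (again assuming `scipy`, and reusing the conditioning identity $Y \mid X = x \sim N(\rho x,\, 1-\rho^2)$) tabulates the ratio for $\epsilon = 1$, where it collapses to $0$, and for $\epsilon = 0$, where it blows up:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def prob(eps, alpha):
    # P[X < -eps, Y > eps] with correlation rho = 1 - alpha,
    # computed by conditioning on X = x.
    rho = 1 - alpha
    s = np.sqrt(1 - rho**2)
    integrand = lambda x: norm.pdf(x) * norm.sf((eps - rho * x) / s)
    return quad(integrand, -np.inf, -eps)[0]

for alpha in (1e-1, 1e-2, 1e-3):
    print(alpha, prob(1.0, alpha) / alpha, prob(0.0, alpha) / alpha)
# eps = 1: the ratio shrinks toward 0 as alpha does;
# eps = 0: the ratio grows, in line with arccos(1 - alpha)/(2*pi*alpha)
```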