I believe that as long as $X,Y$ have zero mean, then regardless of their variances, and for any constant $c$, $P(X > cY)$ should always be $0.5$. It's easier to visualize if you think about the bivariate distribution of $X$ and $Y$ projected onto the $xy$-plane: its contours form a perfect circle if the two variables have the same variance, or an ellipse if their variances differ. Any line through the origin, whatever its slope, cuts the area of the bivariate pdf in half, hence the probability of $0.5$.
Is this correct?
Is this true regardless of whether $X,Y$ are independent or not?
Ignoring degenerate cases such as a variance of $0$ or a correlation of $\pm1$, it is true that $P(X > cY) = \frac12$ if $X$ and $Y$ have a joint bivariate normal distribution, even if they are correlated: $X - cY$ is then itself normally distributed with mean zero, so $P(X - cY > 0) = \frac12$. You get something like this, and by symmetry half the distribution is above the red line $x = cy$.
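This is easy to check numerically. A minimal Monte Carlo sketch in Python (the original answer used R; the correlation $\rho = 0.5$ and constant $c = 0.7$ here are illustrative choices, not from the original):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
rho, c = 0.5, 0.7  # illustrative correlation and constant

# Draw from a correlated bivariate normal with zero means, unit variances
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# X - cY is itself normal with mean 0, so P(X > cY) should be 1/2
print(np.mean(x > c * y))  # close to 0.5
```

Changing `rho` or `c` to other non-degenerate values leaves the estimate at $0.5$, as the symmetry argument predicts.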
So any counterexample needs to break the symmetry. Here is an example using R, where only about $37\%$ of the joint distribution lies above the red line, achieved by twisting points on the left-hand side below the line without changing the marginal distributions.
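The R code is not reproduced here, and the construction below is not the author's; it is a simpler Python sketch of the same symmetry-breaking idea. With equal probability, put $(X, Y)$ on the diagonal $y = x$ or on the anti-diagonal $y = -x$. Each marginal remains exactly $N(0,1)$, yet $P(X > Y) = \tfrac14$ rather than $\tfrac12$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

z = rng.standard_normal(n)
flip = rng.random(n) < 0.5  # choose a diagonal for each point

# (X, Y) lies on y = x or on y = -x with equal probability;
# each marginal is still exactly N(0, 1) since Z is symmetric
x = z
y = np.where(flip, z, -z)

print(np.mean(x > y))  # about 0.25
```

On the diagonal $y = x$ the event $X > Y$ never occurs, and on the anti-diagonal it occurs exactly when $Z > 0$, which has probability $\tfrac12$; averaging the two branches gives $\tfrac14$.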
To check that the two marginal distributions are indeed normal, the simulated empirical densities for $X$ (in blue) and $Y$ (in red) are shown over the theoretical density (in black).
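The same kind of marginal check can be done numerically rather than graphically. A Python sketch using sample moments, applied to a stand-in symmetry-breaking construction (a dependent pair $(X, Y) = (Z, \pm Z)$ whose marginals are both $N(0,1)$, not the original R simulation):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

z = rng.standard_normal(n)
flip = rng.random(n) < 0.5
x = z
y = np.where(flip, z, -z)  # dependent on X, but marginally N(0, 1)

# Both marginals should match N(0, 1): mean near 0, sd near 1
for name, v in (("X", x), ("Y", y)):
    print(name, round(v.mean(), 3), round(v.std(), 3))
```

A more formal check could compare the empirical CDFs against the standard normal CDF, but matching low-order moments already makes the point that dependence, not the marginals, is what was altered.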