Consider two normal distributions with the same variance $\sigma^2$, one with mean 0 and one with mean $\delta > 0$. Let $F_1$ be the CDF of the zero mean distribution and $F_2$ that of the other distribution ($F_2(x) = F_1(x-\delta)$).
Prove that for any $u_1 < v_1 < u_2 < v_2$ (think of two disjoint intervals $(u_1,v_1)$ and $(u_2,v_2)$, with the latter lying to the 'right'):
$$(F_2(v_2)-F_2(u_2))\cdot (F_1(v_1)-F_1(u_1)) > (F_2(v_1)-F_2(u_1))\cdot (F_1(v_2)-F_1(u_2)) $$
or equivalently, $$(F_1(v_2-\delta)-F_1(u_2-\delta))\cdot (F_1(v_1)-F_1(u_1)) > (F_1(v_1-\delta)-F_1(u_1-\delta))\cdot (F_1(v_2)-F_1(u_2)) $$
Intuitively, this says that given two normal distributions with the same variance and two disjoint intervals, it is more likely that a sample from the distribution with the larger mean 'lands' in the 'right' interval while a sample from the distribution with the smaller mean 'lands' in the 'left' interval than the other way around.
I've written some code to verify this numerically, and it seems to be true. However, I haven't figured out how to prove it mathematically. Any pointers are greatly appreciated! Maybe it has something to do with the normal CDF being log-concave?
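For reference, here is a minimal sketch of the kind of numerical check I mean (the function names `Phi` and `check` are my own; it uses only the standard normal CDF via `math.erf`, i.e. $\sigma = 1$):

```python
from math import erf, sqrt

def Phi(x):
    """Standard normal CDF, computed from the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def check(u1, v1, u2, v2, delta):
    """Return (LHS, RHS) of the claimed inequality,
    with F1 = Phi and F2(x) = Phi(x - delta)."""
    lhs = (Phi(v2 - delta) - Phi(u2 - delta)) * (Phi(v1) - Phi(u1))
    rhs = (Phi(v1 - delta) - Phi(u1 - delta)) * (Phi(v2) - Phi(u2))
    return lhs, rhs

# One arbitrary choice of disjoint intervals u1 < v1 < u2 < v2 and delta > 0:
lhs, rhs = check(-1.0, 0.5, 0.7, 2.0, delta=0.8)
print(lhs > rhs)
```

Every choice of parameters I tried satisfies the inequality, which is what prompted the question.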
The trick is to look at densities. Assume for simplicity (but without loss of generality) that $\sigma=1$, and let $\varphi$ denote the density of a $N(0,1)$. Then, for all $t_1<t_2$, \begin{align*} \ln(\varphi(t_2 -\delta))-\ln(\varphi(t_1 -\delta))& =-\frac{1}{2}\left[(t_2-\delta)^2 - (t_1-\delta)^2\right]\\ & = -\frac{1}{2}\left[t_2^2-t_1^2 - 2\delta(t_2-t_1)\right]\\ & = -\frac{1}{2}\left(t_2^2-t_1^2\right) + \delta(t_2-t_1)\\ & > \ln(\varphi(t_2))-\ln(\varphi(t_1)), \end{align*} where the strict inequality holds because $\delta(t_2-t_1)>0$. Therefore, for all $t_1<t_2$, $$\varphi(t_2 - \delta)\varphi(t_1)> \varphi(t_1 - \delta)\varphi(t_2).$$ Integrating $t_2$ over $[u_2,v_2]$ (valid since $u_2>t_1$ keeps $t_2 > t_1$ throughout), we obtain $$\left[F_1(v_2 - \delta)-F_1(u_2 - \delta)\right]\varphi(t_1)> \varphi(t_1 - \delta)\left[F_1(v_2)-F_1(u_2)\right].$$ The result follows by integrating $t_1$ over $[u_1,v_1]$ (with $v_1<u_2$).
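The key pointwise inequality $\varphi(t_2 - \delta)\varphi(t_1)> \varphi(t_1 - \delta)\varphi(t_2)$ is easy to sanity-check numerically; a minimal sketch (the function name `phi` and the sample points are my own choices):

```python
from math import exp, pi, sqrt

def phi(x):
    """Standard normal density."""
    return exp(-x * x / 2.0) / sqrt(2.0 * pi)

# Any t1 < t2 and delta > 0 should satisfy the strict inequality.
t1, t2, delta = -0.3, 1.2, 0.5
print(phi(t2 - delta) * phi(t1) > phi(t1 - delta) * phi(t2))
```

In fact, taking logs of both sides shows the gap is exactly $\delta(t_2 - t_1)$, matching the derivation above.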
Edit: you were right, my first inequality is just log-concavity of the normal density.