Let $X$, $Y$ be random variables. Assume that the cumulative distribution functions $F_X$ and $F_Y$ of $X$ and $Y$ satisfy the following relation: $$F_X(x) > F_Y(x) \hspace{2mm} \forall x$$
Is it possible to conclude that $E[X] > E[Y]$?
My idea: Let $f_X(x), f_Y(x)$ respectively be the probability density functions of $X$ and $Y$. Then
$ F_X(x) > F_Y(x) \hspace{2mm} \forall x \Rightarrow \int_{-\infty}^x f_X(t) \, \mathrm{d}t > \int_{-\infty}^x f_Y(t) \, \mathrm{d}t \hspace{2mm} \forall x \Rightarrow f_X(x) > f_Y(x) \hspace{2mm} \forall x \Rightarrow E[X] > E[Y]$
Assuming the existence of $\mathbb{E}[X]$ and $\mathbb{E}[Y]$, we can conclude that $\mathbb{E}[X] < \mathbb{E}[Y]$ — note the reversed inequality. An intuitive explanation is that $F_X(x) > F_Y(x)$ can be read as follows: $X \leq x$ is more likely to happen than $Y \leq x$, so $X$ tends to take smaller values than $Y$.
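A quick numerical illustration of the reversed inequality (a hypothetical example, not part of the argument): take $X \sim N(0,1)$ and $Y \sim N(1,1)$, so $F_X(x) = \Phi(x) > \Phi(x-1) = F_Y(x)$ for every $x$, and indeed $\mathbb{E}[X] = 0 < 1 = \mathbb{E}[Y]$. The sketch below approximates $\int_{\mathbb{R}} (F_X - F_Y) \, \mathrm{d}x$, which should equal $\mathbb{E}[Y] - \mathbb{E}[X] = 1$.

```python
import math

def Phi(x):
    """Standard normal CDF, written via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def F_X(x):
    return Phi(x)        # X ~ N(0, 1)

def F_Y(x):
    return Phi(x - 1.0)  # Y ~ N(1, 1), so F_X(x) > F_Y(x) for all x

def integral(f, a=-10.0, b=10.0, n=20_000):
    """Midpoint Riemann sum; the integrand is negligible outside [-10, 10]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

diff = integral(lambda x: F_X(x) - F_Y(x))
print(diff)  # close to 1.0 = E[Y] - E[X]
```

The integrand is strictly positive everywhere, matching the claim that the integral — and hence $\mathbb{E}[Y] - \mathbb{E}[X]$ — is positive.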
Indeed, this easily follows once we prove the following formula
$$ \mathbb{E}[X] = \int_{\mathbb{R}} \left( \mathbf{1}_{[0,\infty)}(x) - F_X(x) \right) \, \mathrm{d}x. $$
Assuming this, we have
$$ \mathbb{E}[Y] - \mathbb{E}[X] = \int_{\mathbb{R}} \left( F_X(x) - F_Y(x) \right) \, \mathrm{d}x > 0, $$
where the last step follows from the fact that we are integrating a positive function. So it remains to prove the formula. To this end, it is convenient to write $X = X^+ - X^-$, where $X^+ = \max\{X, 0\}$ is the positive part of $X$ and $X^- = \max\{-X, 0\}$ is the negative part of $X$. Then for each part,
$$ \mathbb{E}[X^+] = \mathbb{E}\left[ \int_{0}^{\infty} \mathbf{1}_{\{ x < X \}} \, \mathrm{d}x \right] = \int_{0}^{\infty} \mathbb{E}\left[ \mathbf{1}_{\{ x < X \}} \right] \, \mathrm{d}x = \int_{0}^{\infty} \left( 1 - F_X(x) \right) \, \mathrm{d}x $$
and
$$ \mathbb{E}[X^-] = \int_{0}^{\infty} \mathbb{E}\left[ \mathbf{1}_{\{ x \leq -X \}} \right] \, \mathrm{d}x = \int_{-\infty}^{0} \mathbb{E}\left[ \mathbf{1}_{\{ X \leq x \}} \right] \, \mathrm{d}x = \int_{-\infty}^{0} F_X(x) \, \mathrm{d}x, $$
where the middle step is the substitution $x \mapsto -x$. Subtracting, $\mathbb{E}[X] = \mathbb{E}[X^+] - \mathbb{E}[X^-]$ yields the claimed formula.
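The layer-cake formula can also be checked numerically. A minimal sketch, assuming the hypothetical choice $X \sim N(0.5, 1)$ (which puts mass on both signs, so both $X^+$ and $X^-$ contribute), where $\mathbb{E}[X] = 0.5$:

```python
import math

def Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def F_X(x):
    return Phi(x - 0.5)  # X ~ N(0.5, 1), so E[X] = 0.5

def integrand(x):
    # 1_{[0, infinity)}(x) - F_X(x): equals 1 - F_X for x >= 0
    # and -F_X for x < 0, matching the split into X^+ and X^-.
    indicator = 1.0 if x >= 0 else 0.0
    return indicator - F_X(x)

def integral(f, a=-10.0, b=10.0, n=20_000):
    """Midpoint Riemann sum; the integrand is negligible outside [-10, 10]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

mean = integral(integrand)
print(mean)  # close to 0.5 = E[X]
```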