Let $Q_0$ and $Q_1$ be distributions on $\mathbb{R}$ with distribution functions $G_0$ and $G_1$, respectively. I am trying to show that
1. $G_0(y) \geq G_1(y)$ for all $y \in \mathbb{R}$,
2. $G_0(y-) \geq G_1(y-)$ for all $y \in \mathbb{R}$,
3. $\int h \, dQ_0 \leq \int h \, dQ_1$ for every non-decreasing function $h: \mathbb{R} \to [0,\infty)$

are equivalent.
My idea was to use the tail (layer-cake) formula for a non-negative random variable $Z \geq 0$:
$$\mathbb{E}(Z)=\int_{0}^{\infty}\mathbb{P}(Z \geq t)\,dt=\int_{0}^{\infty}\mathbb{P}(Z > t)\,dt$$
Any idea to proceed would be appreciated.
1) implies 2) follows by taking left limits. 2) implies 1): distribution functions are continuous except at countably many points, and at every common continuity point $y$ we have $G_0(y) = G_0(y-) \geq G_1(y-) = G_1(y)$; since distribution functions are right-continuous and the remaining points can be approached from the right through continuity points, this extends to all $y$, giving 1). 3) implies 1) follows by taking $h=I_{(y,\infty)}$, which yields $1-G_0(y) \leq 1-G_1(y)$.

The tricky part is 1) implies 3). There are several approaches, but I think the following is an elegant one. Consider $(0,1)$ as a probability space by equipping it with the Borel sigma-algebra and Lebesgue measure. Let $X(\omega)=\inf \{t : G_0(t) \geq \omega\}$. This is the so-called generalized inverse (quantile function) of $G_0$. One checks easily that $X(\omega) \leq t$ if and only if $G_0(t) \geq \omega$ (using right-continuity of $G_0$), and this implies that $G_0$ is the distribution function of $X$, since $\mathbb{P}(X \leq t) = \mathrm{Leb}\{\omega \in (0,1) : \omega \leq G_0(t)\} = G_0(t)$. Similarly, let $Y(\omega)=\inf \{t : G_1(t) \geq \omega\}$, so $G_1$ is the distribution function of $Y$. By 1), $\{t : G_1(t) \geq \omega\} \subseteq \{t : G_0(t) \geq \omega\}$, so $X(\omega) \leq Y(\omega)$ for all $\omega$. Hence $h(X(\omega)) \leq h(Y(\omega))$ for all $\omega$, since $h$ is non-decreasing. Taking expectations, we get 3).
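Incidentally, the layer-cake formula quoted in the question also gives a direct route from 1) to 3); here is a sketch, using that 2) follows from 1):

```latex
% For non-decreasing h >= 0, the superlevel set {x : h(x) > t} is an
% interval of the form (y_t, \infty) or [y_t, \infty), and 1)
% (respectively 2), for the closed case) bounds its Q_0-measure by its
% Q_1-measure:
\begin{align*}
\int h \, dQ_0
  &= \int_0^\infty Q_0(h > t) \, dt
     && \text{(layer-cake formula)} \\
  &\leq \int_0^\infty Q_1(h > t) \, dt
     && \text{since } Q_0((y,\infty)) = 1 - G_0(y) \leq 1 - G_1(y) = Q_1((y,\infty)) \\
  &= \int h \, dQ_1 .
\end{align*}
```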
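The coupling argument can also be sanity-checked numerically. The sketch below (my own toy example, not from the post) uses two finite-support distributions with $G_0 \geq G_1$ pointwise, computes the generalized inverses by scanning the atoms, and verifies both the pointwise ordering $X(\omega) \leq Y(\omega)$ and the resulting inequality of expectations:

```python
import random

# Toy finite-support example: Q0 sits lower than Q1, so G0(y) >= G1(y).
support0, probs0 = [0, 1, 2], [0.5, 0.3, 0.2]
support1, probs1 = [1, 2, 3], [0.2, 0.3, 0.5]

def cdf(support, probs, y):
    """G(y) = Q((-inf, y])."""
    return sum(p for x, p in zip(support, probs) if x <= y)

def quantile(support, probs, omega):
    """Generalized inverse X(omega) = inf{t : G(t) >= omega},
    found by scanning the atoms in increasing order."""
    acc = 0.0
    for x, p in zip(support, probs):
        acc += p
        if acc >= omega:
            return x
    return support[-1]

# 1): G0 >= G1 pointwise (checked on a grid around the supports).
grid = [-1, 0, 0.5, 1, 1.5, 2, 2.5, 3, 4]
assert all(cdf(support0, probs0, y) >= cdf(support1, probs1, y) for y in grid)

# The coupling: X and Y are driven by the same uniform omega on (0,1),
# and X(omega) <= Y(omega) pointwise.
random.seed(0)
omegas = [random.random() for _ in range(10_000)]
xs = [quantile(support0, probs0, w) for w in omegas]
ys = [quantile(support1, probs1, w) for w in omegas]
assert all(x <= y for x, y in zip(xs, ys))

# 3): E h(X) <= E h(Y) for a non-decreasing h >= 0, e.g. h(t) = t^2 on t >= 0.
h = lambda t: max(t, 0.0) ** 2
print(sum(map(h, xs)) / len(xs), "<=", sum(map(h, ys)) / len(ys))
```

Here the exact values are $\mathbb{E}\,h(X) = 1.1$ and $\mathbb{E}\,h(Y) = 5.9$, and the Monte Carlo averages land close to them.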