Suppose $X$ and $Y$ are two random variables, $X>Y$ stochastically, and $g$ is a strictly increasing function. Is it true that $E(g(X))>E(g(Y))$ strictly, and how can one prove it?
By $X>Y$ stochastically I mean $P(X>t)\geq P(Y>t)$ for all $t$, with $P(X>t)>P(Y>t)$ for some $t$. Thanks for your help.
Another way to understand $X>Y$ stochastically is via a coupling: there exists a single random variable $U$ (say, uniform on $(0,1)$) and functions $X(\cdot)$ and $Y(\cdot)$ such that $X(U)$ has the same distribution as $X$, $Y(U)$ has the same distribution as $Y$, $X(U)\geq Y(U)$ everywhere, and $X(U)>Y(U)$ on a set of positive probability. Concretely, the quantile coupling $X(U)=F_X^{-1}(U)$, $Y(U)=F_Y^{-1}(U)$ works, where $F_X^{-1}$ denotes the generalized inverse of the distribution function of $X$.
If $g$ is strictly increasing, then $g(X(U))\geq g(Y(U))$ everywhere and $g(X(U))>g(Y(U))$ with positive probability, so (assuming the expectations exist) $E(g(X))=E\big(g(X(U))\big)>E\big(g(Y(U))\big)=E(g(Y))$.
This is only a sketch, but the quantile coupling above is exactly what one would use to flesh it out rigorously.
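To see the coupling argument in action, here is a minimal numeric sketch (the distributions, the choice $g(x)=x^3$, and the names `F_X_inv`, `F_Y_inv` are my own illustration, not part of the answer): take $Y\sim\mathrm{Uniform}(0,1)$ and $X\sim\mathrm{Uniform}(0.5,1.5)$, so $P(X>t)\geq P(Y>t)$ for all $t$ with strict inequality for $t\in(0,1.5)$, and evaluate both quantile functions at the same $u$.

```python
import numpy as np

# Quantile functions for the two distributions (hypothetical example):
# Y ~ Uniform(0,1), X ~ Uniform(0.5,1.5), so X dominates Y stochastically.
F_Y_inv = lambda u: u          # quantile function of Uniform(0,1)
F_X_inv = lambda u: u + 0.5    # quantile function of Uniform(0.5,1.5)
g = lambda x: x**3             # a strictly increasing function

# Grid on (0,1) standing in for the common uniform variable U.
u = np.linspace(0.0, 1.0, 200001)[1:-1]

# Pointwise coupling inequality: X(u) > Y(u) for every u here,
# hence g(X(u)) > g(Y(u)) pointwise as well.
assert np.all(F_X_inv(u) > F_Y_inv(u))

# E[g(X)] = integral_0^1 g(F_X^{-1}(u)) du, approximated by a grid average.
# Exact values: E[(U+0.5)^3] = 1.25 and E[U^3] = 0.25.
Eg_X = g(F_X_inv(u)).mean()
Eg_Y = g(F_Y_inv(u)).mean()
print(Eg_X, Eg_Y, Eg_X > Eg_Y)
```

The pointwise inequality on the grid mirrors the almost-sure inequality in the coupling, and averaging over $u$ mirrors taking expectations with respect to $U$.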