Let $X$ and $Y$ be two real-valued random variables. I'm looking for statements of the form $E(X)-E(Y) \leq D(X,Y)$, where $D(\cdot,\cdot)$ is some probability metric or divergence (e.g., total variation distance, KL-divergence, etc.). I'm not hoping for anything in full generality, but I'm curious what such statements may look like.
For instance, if $X$ and $Y$ are random variables taking values in the same finite set in $\mathbb{R}$, then it's easy to show that there exists $c>0$ such that $|E(X-Y)|\leq c\cdot D_{TV}(X,Y)$, where $D_{TV}$ is the total variation distance. Are there more general statements of this kind, e.g., for continuous random variables with unbounded support?
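For concreteness, here is a minimal sketch of one way the finite-support claim can be derived; the constant $M := \max_{s \in S}|s|$ is my notation, not from the question.

```latex
% Sketch: X, Y supported on a finite set S in R, with M := max_{s in S} |s|.
% Write p_X, p_Y for the two probability mass functions on S.
\begin{align*}
|E(X) - E(Y)|
  &= \Big| \sum_{s \in S} s \,\bigl(p_X(s) - p_Y(s)\bigr) \Big| \\
  &\le M \sum_{s \in S} \bigl| p_X(s) - p_Y(s) \bigr| \\
  &= 2M \, D_{TV}(X,Y),
\end{align*}
% using D_{TV}(X,Y) = (1/2) \sum_{s} |p_X(s) - p_Y(s)| for discrete laws,
% so c = 2M works.
```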
Since you mention KL-divergence among the examples, I take it it's acceptable if $D$ is not defined (or not finite) for every pair of random variables.
$EX - EY \le \rho(X,Y)$, where $\rho(X,Y) = E|X-Y|$; note that this requires $X$ and $Y$ to be defined on a common probability space.
Moreover, $EX - EY \le \rho_1(X,Y) \le \rho_p(X,Y)$ for $p \ge 1$ by Lyapunov's inequality, where $\rho_p(X,Y) = (E|X-Y|^p)^{\frac{1}{p}}$.
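A quick sketch of the whole chain, assuming $X$ and $Y$ live on a common probability space and $p \ge 1$:

```latex
% Chain of inequalities, assuming a common probability space and p >= 1.
\begin{align*}
E(X) - E(Y) = E(X - Y)
  &\le E|X - Y| = \rho_1(X,Y)
     && \text{(since } X - Y \le |X - Y| \text{ and } E \text{ is monotone)} \\
  &\le \bigl( E|X - Y|^p \bigr)^{1/p} = \rho_p(X,Y)
     && \text{(Lyapunov's inequality, } p \ge 1\text{)}.
\end{align*}
% Note that \rho_p depends on the joint law of (X, Y) (i.e., the coupling),
% not only on the marginal laws; taking the infimum of \rho_1 over all
% couplings with the given marginals yields the Wasserstein-1 distance.
```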