To prove some convergence results for random vectors in $\mathbb{R}^k$ (Theorem 2.7, Asymptotic Statistics), Van der Vaart uses the following claim:
"For every $f$ with a range $[0, 1]$ and Lipschitz norm at most $1$ and every $\epsilon >0$, $$|Ef(X_n) - Ef(Y_n)|\le \epsilon E1_{\{d(X_n, Y_n)\le\epsilon \}} + 2E1_{\{d(X_n, Y_n)>\epsilon \}}."$$ Where $d(x, y)$ is a distance function on $\mathbb{R}^k$.
I could not figure out a proof of this claim, nor find any reference for it. Is there a quick way of showing this result?
Expectation is linear, so one has $$|Ef(X_n) - Ef(Y_n)| = |E(f(X_n) - f(Y_n))| = \bigg|\int_\Omega (f(X_n) - f(Y_n))\,d\mu\bigg|,$$ where $(\Omega,\mu)$ denotes the underlying probability space (using $\Omega$ rather than $X$ to avoid a clash with the random vectors $X_n$). Now split the domain of integration into the events $\{d(X_n,Y_n) \leq \epsilon\}$ and $\{d(X_n,Y_n) > \epsilon\}$. The integral can then be written as $$\int_{\{\omega \in \Omega:\,d(X_n,Y_n)\leq \epsilon\}} (f(X_n) - f(Y_n))\,d\mu + \int_{\{\omega \in \Omega:\,d(X_n,Y_n)> \epsilon\}} (f(X_n) - f(Y_n))\,d\mu.$$ On the first event the Lipschitz condition gives $|f(X_n) - f(Y_n)| \leq d(X_n, Y_n) \leq \epsilon$, and on the second event the range condition $0 \le f \le 1$ gives $|f(X_n) - f(Y_n)| \leq 2$; applying the triangle inequality to the two integrals then yields the bound you want.
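For completeness, here is one way the final chain of inequalities can be written out, combining the triangle inequality with the two pointwise bounds just described:

```latex
\begin{align*}
|Ef(X_n) - Ef(Y_n)|
  &\le \int_{\{d(X_n,Y_n)\le\epsilon\}} |f(X_n) - f(Y_n)|\,d\mu
     + \int_{\{d(X_n,Y_n)>\epsilon\}} |f(X_n) - f(Y_n)|\,d\mu \\
  &\le \int_{\{d(X_n,Y_n)\le\epsilon\}} \epsilon\,d\mu
     + \int_{\{d(X_n,Y_n)>\epsilon\}} 2\,d\mu
   && \text{(Lipschitz bound; $|f|\le 1$)} \\
  &= \epsilon\, E1_{\{d(X_n,Y_n)\le\epsilon\}}
     + 2\, E1_{\{d(X_n,Y_n)>\epsilon\}},
\end{align*}
```

which is exactly the claimed inequality, since the integral of a constant over an event equals that constant times the expectation of the event's indicator.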