Proving a result on Lipschitz functions


To prove some convergence results for random vectors in $\mathbb{R}^k$ (Theorem 2.7, Asymptotic Statistics), Van der Vaart uses the following claim:

"For every $f$ with a range $[0, 1]$ and Lipschitz norm at most $1$ and every $\epsilon >0$, $$|Ef(X_n) - Ef(Y_n)|\le \epsilon E1_{\{d(X_n, Y_n)\le\epsilon \}} + 2E1_{\{d(X_n, Y_n)>\epsilon \}}."$$ Where $d(x, y)$ is a distance function on $\mathbb{R}^k$.

I could not figure out a proof for this claim, nor find any reference for it. Is there any quick way of showing this result?


Best answer:

Expectation is linear, so one has $$|Ef(X_n) - Ef(Y_n)| = |E(f(X_n) - f(Y_n))| = \bigg|\int_\Omega (f(X_n) - f(Y_n))\,d\mu\bigg|,$$ where $(\Omega,\mu)$ denotes the underlying probability space. Now break the domain of integration into the parts where $d(X_n,Y_n) \leq \epsilon$ and where $d(X_n,Y_n) > \epsilon$. The integral can then be written as $$\int_{\{\omega \in \Omega:\,d(X_n,Y_n)\leq \epsilon\}} (f(X_n) - f(Y_n))\,d\mu + \int_{\{\omega \in \Omega:\,d(X_n,Y_n)> \epsilon\}} (f(X_n) - f(Y_n))\,d\mu.$$ Use the Lipschitz condition on the first integral, and the fact that $0 \leq f \leq 1$ on the second integral, to get the bound you want.
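Filling in the last step explicitly (using the notation of this answer), the two integrals are bounded as follows:

```latex
\begin{aligned}
\left|\int_{\{d(X_n,Y_n)\le\epsilon\}} (f(X_n)-f(Y_n))\,d\mu\right|
  &\le \int_{\{d(X_n,Y_n)\le\epsilon\}} d(X_n,Y_n)\,d\mu
  \le \epsilon\, E1_{\{d(X_n,Y_n)\le\epsilon\}},\\
\left|\int_{\{d(X_n,Y_n)>\epsilon\}} (f(X_n)-f(Y_n))\,d\mu\right|
  &\le \int_{\{d(X_n,Y_n)>\epsilon\}} \big(|f(X_n)|+|f(Y_n)|\big)\,d\mu
  \le 2\, E1_{\{d(X_n,Y_n)>\epsilon\}}.
\end{aligned}
```

Adding the two bounds and applying the triangle inequality to the split integral gives the claimed inequality.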

Another answer:

Here is a suggestion: first note that $|Ef(X_n)-Ef(Y_n)| \leq \mathbb{E}[|f(X_n)-f(Y_n)|]$ by the triangle inequality for expectations. Then, for $n\in\mathbb{N}$ and $\epsilon>0$, we have that

$$\mathbb{E}[|f(X_n)-f(Y_n)|]=\mathbb{E}[|f(X_n)-f(Y_n)|1_{d(X_n,Y_n)>\epsilon}]+\mathbb{E}[|f(X_n)-f(Y_n)|1_{d(X_n,Y_n)\leq\epsilon}]$$

Then we can use that $0\leq f\leq1$ (so $|f(X_n)-f(Y_n)|\leq 2$ by the triangle inequality) to bound the first term, and the Lipschitz condition (i.e. $|f(x)-f(y)|\leq d(x,y)$ for $x,y\in\mathbb{R}^k$) to bound the second term on the right-hand side.
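As a numerical sanity check (not part of the proof), one can verify the inequality on simulated data. The sketch below takes $k=1$, $d(x,y)=|x-y|$, and $f(x)=\min(\max(x,0),1)$, which has range $[0,1]$ and Lipschitz norm $1$; the distributions of $X_n$ and $Y_n$ here are arbitrary choices for illustration.

```python
import random

# Check |E f(X) - E f(Y)| <= eps * E 1{d(X,Y)<=eps} + 2 * E 1{d(X,Y)>eps}
# for f(x) = min(max(x, 0), 1), which has range [0,1] and Lipschitz norm 1
# with respect to d(x, y) = |x - y|.

def f(x):
    return min(max(x, 0.0), 1.0)

def check(eps, n=100_000, seed=0):
    """Return (empirical LHS, empirical RHS) of the inequality."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.5, 1.0) for _ in range(n)]
    ys = [x + rng.gauss(0.0, 0.3) for x in xs]  # Y coupled to X
    lhs = abs(sum(f(x) for x in xs) / n - sum(f(y) for y in ys) / n)
    close = sum(1 for x, y in zip(xs, ys) if abs(x - y) <= eps) / n
    rhs = eps * close + 2.0 * (1.0 - close)
    return lhs, rhs

for eps in (0.1, 0.5, 1.0):
    lhs, rhs = check(eps)
    assert lhs <= rhs, (eps, lhs, rhs)
```

The empirical inequality holds exactly for any sample, not just in the limit, because the bound $|f(x)-f(y)| \leq \epsilon 1_{\{d(x,y)\leq\epsilon\}} + 2\cdot 1_{\{d(x,y)>\epsilon\}}$ is pointwise.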