Prove that $X_n$ converges to $X$ in probability if and only if the distance converges to zero


Let $X, Y$ be random variables on $(\Omega, \mathcal{F}, \mathbb{P})$ and define a distance $d(X, Y) = \mathbb{E}[\min\{1, |X - Y|\}]$. Prove that $X_n$ converges to $X$ in probability if and only if $d(X_n, X) \rightarrow 0$.
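As a quick sanity check (not part of the proof), here is a small Monte Carlo sketch of this metric. The sequence $X_n = X + Z/n$ with independent standard normal $X$ and $Z$ is an assumed example; since $X_n \to X$ in probability, the estimated distance should shrink as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)

def d(x, y):
    # Monte Carlo estimate of d(X, Y) = E[min{1, |X - Y|}]
    return np.mean(np.minimum(1.0, np.abs(x - y)))

N = 100_000
X = rng.normal(size=N)       # samples of X
noise = rng.normal(size=N)   # independent noise Z

# Hypothetical sequence X_n = X + Z/n, which converges to X in probability
for n in [1, 10, 100]:
    Xn = X + noise / n
    print(n, d(Xn, X))
```

The printed estimates decrease toward zero, matching the claimed equivalence.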

For the "$\Leftarrow$" part, Markov's inequality suffices: suppose $d(X_n, X) \rightarrow 0$ and let $\varepsilon \in (0, 1)$ be arbitrary. Then $$ \mathbb{P}(|X_n - X| > \varepsilon) = \mathbb{P}(\min\{1, |X_n - X|\} > \varepsilon) \le \frac{\mathbb{E}[\min\{1, |X_n - X|\}]}{\varepsilon} = \frac{d(X_n, X)}{\varepsilon} \rightarrow 0, $$ where the first equality holds because $\varepsilon < 1$, so $\min\{1, |X_n - X|\} > \varepsilon$ exactly when $|X_n - X| > \varepsilon$.
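The Markov step can also be illustrated numerically. This is only a sketch under an assumed distribution: `diff` below stands in for $X_n - X$ and is taken to be scaled standard normal noise.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for X_n - X: scaled standard normal noise.
diff = rng.normal(size=200_000) / 5
m = np.minimum(1.0, np.abs(diff))   # min{1, |X_n - X|}

eps = 0.3
lhs = np.mean(m > eps)              # empirical P(min{1, |X_n - X|} > eps)
rhs = np.mean(m) / eps              # empirical E[min{1, |X_n - X|}] / eps
print(lhs, rhs)                     # Markov's inequality: lhs <= rhs
```

Markov's inequality holds exactly for the empirical distribution as well, since $m \ge \varepsilon \mathbf{1}\{m > \varepsilon\}$ pointwise.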

But I have trouble with the "$\Rightarrow$" part: given that $X_n \rightarrow X$ in probability, we want to show that $d(X_n, X) \rightarrow 0$. So far I have, for fixed $\varepsilon > 0$,

$$ \mathbb{E}[\min\{1, |X_n - X|\}] \le \int_{|X_n - X| < \varepsilon} |X_n - X| \, d\mathbb{P} + \int_{|X_n - X| \ge \varepsilon} 1 \, d\mathbb{P}. $$ Since $X_n \to X$ in probability, the right term tends to zero. But how do I deal with the left term?

Best answer:

$$\int_{|X_n - X| < \varepsilon} |X_n - X| \, d\mathbb{P} \le \int_{|X_n - X| < \varepsilon} \varepsilon \, d\mathbb{P} \le \varepsilon.$$
Combined with your bound, this gives $d(X_n, X) \le \varepsilon + \mathbb{P}(|X_n - X| \ge \varepsilon)$, so $\limsup_{n} d(X_n, X) \le \varepsilon$. Since $\varepsilon > 0$ was arbitrary, $d(X_n, X) \rightarrow 0$.