Why is $\lim\limits_{n\rightarrow\infty}\mathbb{E}(\min(|X_n-X|,1))=0$ an equivalent criterion for convergence in probability?


Let $X$, $X_1,X_2,\dots$ be random variables defined on $(\Omega,\mathcal{F},\mathbb{P})$. Why is the following biconditional characterization of convergence in probability true? $$\lim\limits_{n\rightarrow\infty}\mathbb{E}(\min(|X_n-X|,1))=0\Leftrightarrow\forall\epsilon>0,\ \lim\limits_{n\rightarrow\infty}\mathbb{P}(|X_n-X|\geq\epsilon)=0$$

I tried splitting the expectation into the regions $|X_n-X|>1$ and $|X_n-X|\leq1$ and applying Markov's inequality, but I got nowhere. I suspect this must be quite trivial, yet I keep getting stuck, so any help would be greatly appreciated. Thanks in advance.


There is 1 best solution below

BEST ANSWER

"$\Leftarrow$" Use that $$\mathbb{E}\min(|X_n-X|,1) \leq \int_{|X_n-X| < \epsilon} |X_n-X| \, d\mathbb{P} + \int_{|X_n-X| \geq \epsilon} 1 \, d\mathbb{P}.$$

"$\Rightarrow$": Show that $$\mathbb{P}(|X_n-X| \geq \epsilon) = \mathbb{P}(\min(|X_n-X|,1) \geq \epsilon)$$ for all $\epsilon \in (0,1)$ and apply Markov's inequality.