Problem about convergence in Probability (2)


Let $X_1,X_2,\dots$ be a sequence of random variables with $$ \lim_{n\rightarrow+\infty}E\left[\left|X_n\right|\right]=0 $$ Is it true or false that the sequence $X_n$ must converge to $0$ in probability?

If true, prove it. If false, provide a counter example.

Thanks very much.


There are 2 answers below.

Answer 1:

Hint: just apply Markov's inequality.
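Spelling the hint out (this step is not in the original answer): Markov's inequality gives, for every $\epsilon > 0$,

$$ \Bbb{P}(\left| X_n \right| \geq \epsilon) \leq \frac{\Bbb{E}\left[\left| X_n \right|\right]}{\epsilon} \xrightarrow{n\to\infty} 0, $$

which is exactly the statement that $X_n$ converges to $0$ in probability.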

Answer 2:

Convergence in probability, $X_n \xrightarrow{p} X$, is defined as

$$ \forall \epsilon > 0, \lim_{n\to\infty} \Bbb{P}(\left| X_n - X \right| < \epsilon) = 1. $$

By noting that

$$ \epsilon \Bbb{P}( \left| X - Y \right| \geq \epsilon) \leq \Bbb{E} [ \left| X - Y \right| \wedge 1 ] \leq \epsilon + \Bbb{P}( \left| X - Y \right| \geq \epsilon) $$

for any $\epsilon \in (0, 1]$ (the first inequality holds because $\left| X - Y \right| \wedge 1 \geq \epsilon$ on the event $\{\left| X - Y \right| \geq \epsilon\}$), we obtain

$$ X_n \xrightarrow{p} X \quad \Longleftrightarrow \quad \lim_{n\to\infty} \Bbb{E} [ \left| X_n - X \right| \wedge 1 ] = 0. $$

Plugging in $X = 0$, we have

$$ \Bbb{E} [ \left| X_n \right| \wedge 1 ] \leq \Bbb{E} \left| X_n \right| \xrightarrow{n\to\infty} 0 $$

and therefore $X_n$ converges to $0$ in probability.

This argument in fact shows two things:

  1. Convergence in probability is metrizable with the metric $d(X, Y) = \Bbb{E} [|X - Y| \wedge 1]$, and
  2. $L^1$-convergence implies convergence in probability.
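As a quick numerical sanity check (not part of the original answer), here is a sketch in Python: take $X_n \sim \mathrm{Exp}(\text{rate}=n)$, so that $\Bbb{E}\left|X_n\right| = 1/n \to 0$, and watch the empirical tail probability $\Bbb{P}(\left|X_n\right| \geq \epsilon)$ shrink, always staying below the Markov bound $\Bbb{E}\left|X_n\right|/\epsilon$. The function name and parameters are illustrative choices, not from the answer above.

```python
import random

def estimate(n, eps=0.1, trials=100_000, seed=0):
    """Monte Carlo estimates of E|X_n| and P(|X_n| >= eps)
    for X_n ~ Exponential(rate n), so E|X_n| = 1/n -> 0."""
    rng = random.Random(seed)
    samples = [rng.expovariate(n) for _ in range(trials)]
    mean_abs = sum(samples) / trials
    tail = sum(x >= eps for x in samples) / trials
    return mean_abs, tail

for n in (1, 10, 100):
    mean_abs, tail = estimate(n)
    # Markov's inequality: P(|X_n| >= eps) <= E|X_n| / eps
    print(f"n={n:3d}  E|X_n|~{mean_abs:.4f}  "
          f"P(|X_n|>=0.1)~{tail:.5f}  Markov bound={mean_abs / 0.1:.4f}")
```

As $n$ grows, both the estimated $\Bbb{E}\left|X_n\right|$ and the tail probability go to $0$, with the tail always dominated by the Markov bound, matching the two answers above.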