Convergence in Probability does not imply Convergence in Quadratic Mean: Counterexample


The following counterexample should show that convergence in probability does not imply convergence in quadratic mean: $$X_i=\begin{cases}0, & \text{with probability } 1-\frac{1}{i}\\ i, & \text{with probability } \frac{1}{i} \end{cases}$$

I am having trouble proving the seemingly simple fact that this sequence of random variables converges in probability to $0$. Using Markov's inequality:

$$P(|X_i-0|>\varepsilon)\leq \frac{1}{\varepsilon}E[|X_i|] =\frac{1}{\varepsilon}\big(0\cdot \big(1-\frac{1}{i}\big) + i \cdot \frac{1}{i}\big)=\frac{1}{\varepsilon}$$

Taking the limit as $i\to \infty$ does not yield 0 as expected. What am I doing wrong?

I am fairly confident I have correctly shown $X_i$ does not converge in quadratic mean. Can you please check my work?

$$\lim_{i\to\infty} E[|X_i-0|^2]=\lim_{i\to\infty} E[X_i^2]=\lim_{i\to\infty} \Big(0^2\cdot \big(1-\frac{1}{i}\big) + i^2 \cdot \frac{1}{i}\Big) = \lim_{i\to\infty} i = \infty$$
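Both limits can be sanity-checked numerically. A minimal sketch: since $X_i$ takes only the two values $0$ and $i$, the probability $P(|X_i|>\varepsilon)$ and the second moment $E[X_i^2]$ can be computed exactly from the definition (the function names below are just for illustration):

```python
# X_i = 0 with probability 1 - 1/i, and X_i = i with probability 1/i.

def prob_exceeds(i, eps=0.5):
    """Exact P(|X_i - 0| > eps): only the outcome X_i = i exceeds eps (for i > eps)."""
    return 1.0 / i if i > eps else 0.0

def second_moment(i):
    """Exact E[X_i^2] = 0^2 * (1 - 1/i) + i^2 * (1/i) = i."""
    return 0.0 ** 2 * (1 - 1.0 / i) + i ** 2 * (1.0 / i)

for i in (10, 100, 1000, 10000):
    print(i, prob_exceeds(i), second_moment(i))
# P(|X_i| > eps) = 1/i shrinks to 0 (convergence in probability),
# while E[X_i^2] = i grows without bound (no convergence in quadratic mean).
```

This makes the tension concrete: the exceedance probability vanishes even though the second moment diverges, because the rare outcome $X_i=i$ is large enough to dominate $E[X_i^2]$.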



To show convergence in probability, you don't need Markov's inequality. Just look at the definition of $X_i$: $$P(|X_i-0|>\epsilon)\le P(X_i \ne 0)$$ since the event on the left implies the event on the right, and the event on the right has an easy probability.
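Spelled out: for any $\varepsilon>0$ and $i>\varepsilon$, the only way to have $|X_i|>\varepsilon$ is the outcome $X_i=i$, so $$P(|X_i-0|>\varepsilon)\le P(X_i\ne 0)=\frac{1}{i}\xrightarrow[i\to\infty]{}0,$$ which gives $X_i\stackrel{p}{\to}0$ with no moment bound at all. Markov's inequality is not wrong here, it is simply too weak: it discards the fact that the nonzero outcome is rare.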

As for your second proof, it's correct: the quantity to examine is indeed $$\lim_{i\to\infty} E[|X_i-0|^2],$$ and you have shown it diverges.


If $X$ denotes a constant (degenerate) random variable, then: $$X_i\stackrel{d}{\to}X\iff X_i\stackrel{p}{\to} X,$$ so in order to prove that $X_i\stackrel{p}{\to} 0$ it is enough to show that $F_{X_i}(x)$ converges to $1$ if $x>0$ and converges to $0$ if $x<0$ (which is quite easy).
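Carrying this out for the $X_i$ above: the distribution function is $$F_{X_i}(x)=\begin{cases}0, & x<0\\ 1-\frac{1}{i}, & 0\le x<i\\ 1, & x\ge i,\end{cases}$$ so for any fixed $x>0$ we have $F_{X_i}(x)=1-\frac{1}{i}\to 1$ once $i>x$, and $F_{X_i}(x)=0$ for all $x<0$. Hence $X_i\stackrel{d}{\to}0$, and by the equivalence above, $X_i\stackrel{p}{\to}0$.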