Convergence in distribution doesn't imply convergence in probability


I know that convergence in distribution does not imply convergence in probability. However, I came across these two lemmas:

  • Lemma 1: $X_n \xrightarrow{p} X \implies X_n \xrightarrow{d} X$.

  • Lemma 2: $X_n \xrightarrow{d} c \implies X_n \xrightarrow{p} c$ if $c$ is constant.

So, using these we have:

$$X_n \xrightarrow{d} X \implies X_n - X \xrightarrow{d} 0 \implies X_n - X \xrightarrow{p} 0 \implies X_n \xrightarrow{p} X.$$

I know that the above argument doesn't hold, but I can't see at which step it breaks down. Can someone explain where it goes wrong and give a counterexample?

Accepted answer:

The first implication is wrong: convergence in distribution is a statement about the marginal distributions only, so it tells you nothing about the joint distribution of $X_n$ and $X$, and you cannot subtract them and expect the difference to vanish. Suppose e.g. that the $X_n$ and $X$ are i.i.d. standard normal. Then $X_n \xrightarrow{d} X$, since all the variables have the same distribution, but $X_n - X \sim N(0, 2)$ for every $n$, so $X_n - X \xrightarrow{d} N(0,2)$, not $0$.
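A quick numerical sketch of the counterexample above (the choice of threshold $\varepsilon = 1$ and the sample size are arbitrary): if $X_n - X \xrightarrow{p} 0$ held, then $P(|X_n - X| > 1)$ would have to shrink toward $0$; but with $X_n$ and $X$ i.i.d. standard normal, $X_n - X \sim N(0,2)$ for every $n$, so that probability stays fixed at $2\,(1 - \Phi(1/\sqrt{2})) \approx 0.48$.

```python
import math
import random

random.seed(0)
trials = 100_000

# Independent samples of X and X_n (i.i.d. standard normals).
x = [random.gauss(0, 1) for _ in range(trials)]
xn = [random.gauss(0, 1) for _ in range(trials)]

# Empirical P(|X_n - X| > 1); this does NOT depend on n, so it never shrinks.
p_exceed = sum(abs(a - b) > 1 for a, b in zip(xn, x)) / trials

# Theoretical value: X_n - X ~ N(0, 2), so
# P(|X_n - X| > 1) = 2 * (1 - Phi(1/sqrt(2))), where Phi is the standard
# normal CDF, Phi(z) = 0.5 * (1 + erf(z / sqrt(2))).
theory = 2 * (1 - 0.5 * (1 + math.erf(0.5)))  # erf((1/sqrt 2)/sqrt 2) = erf(1/2)

print(f"empirical   P(|X_n - X| > 1) = {p_exceed:.3f}")
print(f"theoretical P(|X_n - X| > 1) = {theory:.3f}")
```

The empirical frequency sits near $0.48$ however many samples you draw, which is exactly the failure of convergence in probability.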