Convergence in distribution implies convergence in probability only when the limit is constant


I would like to know why convergence in distribution doesn't imply convergence in probability EXCEPT if the limit is constant.

I can easily see examples of where this holds, and can show that if the limit is constant then the convergences are equivalent.

How do you show that $X$ being constant is the only case where the convergences are equivalent? (I believe intuitively that this is true.)

Thanks

There are two answers below.

Answer 1

I think you have this backwards. Convergence in probability in fact does imply convergence in distribution. They aren't equivalent because the converse is false: convergence in distribution does not imply convergence in probability.

For example, if $A\subset\Omega$ has probability $1/2$, let $X_n = I_A$ for $n$ odd and $X_n = 1-I_A$ for $n$ even. ($I_A$ is the function on $\Omega$ that is $1$ for $\omega\in A$ and $0$ otherwise.) These converge in distribution since they all have the same distribution, namely $F(x)=0$ if $x\lt 0$, $F(x) = 1/2$ if $0\le x\lt 1$, and $F(x)=1$ otherwise. But they don't converge in probability to anything, since no matter how large $n$ gets, we have $|X_{n+1}-X_n|>1/2$ with probability $1$.
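This counterexample is easy to check numerically. A minimal sketch (assuming NumPy; the set $A$ is simulated by $100{,}000$ Bernoulli$(1/2)$ draws standing in for sample points $\omega$):

```python
import numpy as np

rng = np.random.default_rng(0)
# Indicator of A at 100,000 simulated sample points; P(A) = 1/2.
I_A = rng.integers(0, 2, size=100_000)

def X(n):
    """X_n = I_A for n odd, X_n = 1 - I_A for n even."""
    return I_A if n % 2 == 1 else 1 - I_A

# Every X_n has the same distribution: P(X_n = 1) ≈ 1/2 ...
for n in (1, 2, 99, 100):
    assert abs(X(n).mean() - 0.5) < 0.01

# ... yet |X_{n+1} - X_n| = 1 at every sample point, for every n,
# so the sequence cannot converge in probability to anything.
assert (np.abs(X(2) - X(1)) == 1).all()
```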

Note that technically even the special case you cite, convergence in distribution to a constant (i.e., for some constant $c$, $F(x)=0$ for $x\lt c$ and $F(x)=1$ otherwise), only implies convergence in probability if the $X_n$ are all defined on the same probability space $\Omega$.

Answer 2

Convergence in probability does imply convergence in distribution.

See here for a proof.

Convergence in distribution does not imply convergence in probability in general.

In the special case $X_n\stackrel{d}{\to}c$ it can be shown that also $X_n\stackrel{p}{\to}c$.
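The proof of that special case is short. A sketch in terms of the distribution functions $F_n$ of the $X_n$: the limit distribution is $F(x)=0$ for $x\lt c$ and $F(x)=1$ for $x\ge c$, so every point other than $c$ is a continuity point of $F$, and for any $\epsilon>0$:

```latex
\mathsf P(|X_n - c| > \epsilon)
  = \mathsf P(X_n < c - \epsilon) + \mathsf P(X_n > c + \epsilon)
  \le F_n(c - \epsilon) + \bigl(1 - F_n(c + \epsilon)\bigr)
  \longrightarrow F(c - \epsilon) + \bigl(1 - F(c + \epsilon)\bigr)
  = 0 + (1 - 1) = 0.
```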

See here for a proof of that.

If $X_n\stackrel{d}{\to}X$ then it might be that $X$ is not defined on the same probability space as the $X_n$. In that case the expression $X_n\stackrel{p}{\to}X$ makes no sense. This problem "disappears" if $X$ is a constant $c$, because on any probability space we can define a random variable that takes the value $c$ for every $\omega\in\Omega$. This already indicates that convergence in distribution can only imply convergence in probability if the limit is a constant.

If the $X_n$ and $X$ are defined on the same probability space then we can choose some $Y$ defined on a different probability space having the same distribution as $X$. Then $X_n\stackrel{d}{\to}X$ implies $X_n\stackrel{d}{\to}Y$ but as stated above $X_n\stackrel{p}{\to}Y$ makes no sense.

This is already enough to conclude that $X_n\stackrel{d}{\to}X$ can only imply $X_n\stackrel{p}{\to}X$ if $X$ is degenerate, so what follows is actually redundant.

Let $X_n\stackrel{d}{\to}X$ where $X$ is not degenerate and the $X_n$ and $X$ are defined on the same probability space.

Now let $Y$ be a random variable on that same probability space, independent of $X$ and with the same distribution, so that $X$ and $Y$ are i.i.d.

Then also $X_n\stackrel{d}{\to}Y$.

Because $X$ and $Y$ are i.i.d. and not degenerate, an $\epsilon>0$ can be found such that $\mathsf P(|X-Y|>2\epsilon)>0$.

Then the inequality $$\mathsf P(|X-X_n|>\epsilon)+\mathsf P(|Y-X_n|>\epsilon)\geq\mathsf P(|X-Y|>2\epsilon)$$ (which follows from the triangle inequality: if $|X-Y|>2\epsilon$, then $|X-X_n|>\epsilon$ or $|Y-X_n|>\epsilon$) tells us that $X_n$ cannot converge in probability to $X$ and to $Y$ at the same time.

So we have now proved that $$\neg [X_n\stackrel{p}{\to}X]\vee\neg [X_n\stackrel{p}{\to}Y].$$

So whenever $X_n\stackrel{d}{\to}X$ and $X$ is not degenerate, we can find a random variable $Y$ that has the same distribution as $X$ and is defined on the same probability space, such that $X_n\stackrel{p}{\to}X$ fails or $X_n\stackrel{p}{\to}Y$ fails.
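The key inequality above can be checked numerically. A minimal sketch (assuming NumPy, with $X$ and $Y$ i.i.d. standard normal and the trivial sequence $X_n = X$, which of course converges in probability to $X$):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
X = rng.standard_normal(N)   # X, not degenerate
Y = rng.standard_normal(N)   # independent copy of X on the same space
X_n = X                      # trivially X_n ->p X

eps = 0.5
lhs = np.mean(np.abs(X - Y) > 2 * eps)
rhs = np.mean(np.abs(X - X_n) > eps) + np.mean(np.abs(Y - X_n) > eps)

# The inequality holds sample-by-sample (event inclusion via the
# triangle inequality), hence for empirical frequencies too.
assert lhs <= rhs

# P(|X - Y| > 2*eps) is bounded away from 0 ...
assert lhs > 0.1
# ... while P(|X - X_n| > eps) = 0, so all the mass on the right-hand
# side comes from P(|Y - X_n| > eps): X_n does not converge to Y.
assert np.mean(np.abs(X - X_n) > eps) == 0.0
```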