Is convergence in probability sometimes equivalent to almost sure convergence?


I was reading about sufficient and necessary conditions for the strong law of large numbers on this Encyclopedia of Mathematics page, and I came across the following curious passage:

The existence of such examples is not at all obvious at first sight. The reason is that even though, in general, convergence in probability is weaker than convergence with probability one, nevertheless the two types of convergence are equivalent, for example, in the case of series of independent random variables.

("Such examples" in the above refers to an example of a sequence $(X_k:k\in\mathbb N)$ where the weak law of large numbers holds but the strong law fails, due to some dependence).

This really surprised me; I had never heard of this claim, and unfortunately the page gives no reference for it. I was not able to find one with a Google search or in the textbooks I have, either.

If this is indeed true, I would be very interested in a reference, or a counterexample if it is mistaken.

There are 3 answers below.

Accepted answer:

See https://en.wikipedia.org/wiki/Convergence_of_random_variables#Properties_4 (the third property counting from the end of the paragraph).

You may have been confused because you read "series" as "sequences": the equivalence holds for the partial sums $S_n=X_1+\cdots+X_n$ of a series of independent random variables, not for an arbitrary independent sequence $(X_n)$.
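To make the "series" case concrete, here is a quick numerical sketch (my own illustration, not part of the original answer): a random-sign harmonic series, whose partial sums form a series of independent random variables. Since $\sum_k 1/k^2<\infty$, Kolmogorov's two-series theorem gives almost sure convergence, and by Lévy's equivalence theorem this coincides with convergence in probability.

```python
import random

# Random-sign harmonic series: S_n = sum_{k=1}^n eps_k / k with eps_k = +/-1
# i.i.d. fair signs.  Since sum 1/k^2 < infinity, S_n converges almost surely
# (Kolmogorov), so each sample path should visibly settle down.

def partial_sums(n, rng):
    """Partial sums S_1, ..., S_n of the random-sign harmonic series."""
    s, out = 0.0, []
    for k in range(1, n + 1):
        s += rng.choice((-1.0, 1.0)) / k
        out.append(s)
    return out

rng = random.Random(1)
path = partial_sums(100_000, rng)

# Oscillation of one sample path after step 10_000: for an a.s. convergent
# series this should be small (the tail variance is about 1e-4).
tail = path[10_000:]
tail_osc = max(tail) - min(tail)
print(tail_osc)
```

Rerunning with other seeds gives the same qualitative picture: the tail oscillation of each path is tiny, which is what pathwise (almost sure) convergence of the series looks like in simulation.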

Answer:

If $X_n$ is a sequence of independent random variables that converges in probability to a random variable $X$, then $X$ is measurable with respect to the tail $\sigma$-algebra of $(X_n)$, hence almost surely constant by Kolmogorov's zero-one law; so we may treat $X$ as a constant. For any $\varepsilon\in(0,1)$, we may choose $N$ such that $n\geqslant N$ implies $$\mathbb P(|X_n-X|<\varepsilon)\geqslant 1-\varepsilon. $$ Set $E_n = \{\omega : |X_n(\omega)-X(\omega)|<\varepsilon\}$. Then the $E_n$ are independent and $$\sum_{n=1}^\infty \mathbb P(E_n) \geqslant \sum_{n=N}^\infty (1-\varepsilon)=\infty, $$ so by the second Borel–Cantelli lemma, $\mathbb P\left(\limsup_{n\to\infty} E_n\right) = 1$.

But without additional assumptions, we cannot conclude that $\mathbb P\left(\liminf_{n\to\infty}E_n\right) =1$ (i.e. that $X_n\stackrel{a.s.}\longrightarrow X$).
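A quick simulation of the standard counterexample (my own illustration; the specific distribution is an assumption, not taken from the answer): independent $X_n$ with $\Pr(X_n=1)=1/n$ converge to $0$ in probability, yet by a second Borel–Cantelli argument they equal $1$ infinitely often with probability one, so they do not converge almost surely.

```python
import random

# Independent X_n with P(X_n = 1) = 1/n and P(X_n = 0) = 1 - 1/n.
# P(|X_n| > eps) = 1/n -> 0, so X_n -> 0 in probability; but
# sum 1/n = infinity, so by the second Borel-Cantelli lemma
# X_n = 1 infinitely often a.s., and X_n does not converge a.s.

def sample_path(n_max, rng):
    """One realization of (X_1, ..., X_{n_max})."""
    return [1 if rng.random() < 1.0 / n else 0 for n in range(1, n_max + 1)]

rng = random.Random(0)
trials = 200

# Fraction of paths that still hit 1 somewhere in n = 5000..10000.
# The exact probability telescopes: 1 - prod_{n=5000}^{10000} (1 - 1/n)
# = 1 - 4999/10000, about 0.5 -- late exceedances never die out.
late_hit = sum(
    1 for _ in range(trials)
    if any(x == 1 for x in sample_path(10_000, rng)[4999:])
) / trials
print(late_hit)
```

No matter how far out the window $[N, 2N]$ is pushed, roughly half of all paths still take the value $1$ inside it, which is exactly the failure of almost sure convergence despite convergence in probability.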

Answer:

Let $(X_k:k\in\mathbb N)$ be independent random variables defined on some probability space $(\Omega,\mathscr A,P)$, and for each $n\in\mathbb N$, let $S_n=X_1+\cdots+X_n$. Suppose that $S_n\to X$ in probability. We will show that $S_n\to X$ almost surely, i.e. that the series converges with probability one.


For every $\epsilon>0$, define the event \begin{align*} C(\epsilon)=\bigcap_{n\in\mathbb N}\left[\sup_{j,l\geq n}|S_j-S_l|>2\epsilon\right]. \end{align*} If $\omega\in\Omega$ is such that $S_n(\omega)$ does not converge, that is, $(S_n(\omega):n\in\mathbb N)$ is not a Cauchy sequence, then $\omega$ must be contained in the union \begin{align*} \bigcup_{\epsilon\in\mathbb Q\cap(0,\infty)}C(\epsilon). \end{align*} Therefore, if we show that $C(\epsilon)$ has probability $0$ for every $\epsilon>0$, the claim will follow by countable subadditivity.


Since $$|S_j(\omega)-S_l(\omega)| \leq|S_{n+k}(\omega)-S_n(\omega)|+|S_{n+k'}(\omega)-S_n(\omega)|$$ with $j=n+k$ and $l=n+k'$, it follows from monotonicity and continuity from above of probability measures that \begin{align}\tag{1} \Pr\big(C(\epsilon)\big) =\lim_{n\to\infty} \Pr\left[\sup_{j,l\geq n}|S_j-S_l|>2\epsilon\right] \leq\lim_{n\to\infty} \Pr\left[\sup_{k\geq 1}|S_{n+k}-S_n|>\epsilon\right]. \end{align} So we need only prove that the right-hand side of $(1)$ converges to zero.


Notice that, for every $\epsilon>0$ and $k\geq1$, $$\Pr[|S_{n+k}-S_n|\geq\epsilon] \leq\Pr[|S_{n+k}-X|\geq\epsilon/2]+\Pr[|S_n-X|\geq\epsilon/2],$$ and since $S_n\to X$ in probability, both terms on the right tend to zero as $n\to\infty$, uniformly in $k$. Hence \begin{align} \lim_{n\to\infty}\sup_{k\geq 1}\Pr[|S_{n+k}-S_n|\geq\epsilon]=0. \end{align} The result then follows by moving the supremum inside the probability, which is justified by an application of Etemadi's inequality.
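For reference, the final step can be spelled out with one standard formulation of Etemadi's inequality (stated here with the usual textbook constants): for independent $X_1,\dots,X_m$ with partial sums $T_k=X_1+\cdots+X_k$ and any $\alpha>0$, $$\Pr\Big(\max_{1\le k\le m}|T_k|\ge 3\alpha\Big)\;\le\;3\max_{1\le k\le m}\Pr\big(|T_k|\ge\alpha\big).$$ Applied to the increments $T_k=S_{n+k}-S_n$ with $\alpha=\epsilon/3$ and letting $m\to\infty$, this bounds $\Pr\big[\sup_{k\geq1}|S_{n+k}-S_n|>\epsilon\big]$ by $3\sup_{k\geq1}\Pr[|S_{n+k}-S_n|\geq\epsilon/3]$, which tends to $0$ by the limit displayed above.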