Interpreting almost sure convergence


I'm reading: https://en.wikipedia.org/wiki/Convergence_of_random_variables#Almost_sure_convergence and here it says that

Given a probability space $(\Omega,\mathcal{F},P)$ and a random variable $X:\Omega \rightarrow \mathbb{R}$, almost sure convergence means $$P\left(\omega \in \Omega: \lim_{n \rightarrow \infty} X_n(\omega)=X(\omega)\right)=1.$$ [...] almost sure convergence can also be defined as follows: $$P\left(\limsup_{n \rightarrow \infty} \left\{\omega \in \Omega: |X_n(\omega) - X(\omega)| > \varepsilon\right\}\right)=0, \quad \forall \; \varepsilon>0.$$

My question is: what is the intuition behind this equivalence? I understand the first definition, but why do we use $\limsup$ in the second one to make the equivalence work? Thanks.

2 Answers

BEST ANSWER

I don't really see intuition here; the equivalence just follows from the definition of convergence. For a sequence of sets $(A_n)$, the set $\limsup A_n = \{A_n \text{ i.o.}\}$ ("infinitely often") is the set of elements that belong to infinitely many of the sets $A_n$. The formal definition of this set is $\bigcap_{n=1}^\infty \bigcup_{k=n}^\infty A_k$.
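The "infinitely often" reading can be checked by hand on a small, hypothetical example (not from the answer): for an eventually periodic sequence of sets, each tail union $\bigcup_{k\ge n} A_k$ can be computed over a single period, so $\limsup_n A_n = \bigcap_n \bigcup_{k \ge n} A_k$ reduces to a finite computation.

```python
# Hypothetical finite illustration of limsup A_n = intersection_n union_{k>=n} A_k.
# A_n alternates between {1} and {0}, with an element 2 thrown into the first
# three sets only.  2 belongs to finitely many A_n, so it must NOT survive
# into the limsup; 0 and 1 each appear infinitely often, so they must.

def A(n):
    base = {n % 2}          # A_1 = {1}, A_2 = {0}, A_3 = {1}, A_4 = {0}, ...
    if n <= 3:
        base.add(2)         # 2 appears only in A_1, A_2, A_3 (finitely often)
    return base

PERIOD = 2

def tail_union(n):
    """union_{k>=n} A_k.  Exact here: the sequence is periodic with period 2
    past the transient, so one period of the tail already covers the union."""
    s = set()
    for k in range(n, n + PERIOD):
        s |= A(k)
    return s

def limsup_A(depth=20):
    """intersection_{n>=1} tail_union(n), over enough n to stabilize."""
    result = tail_union(1)
    for n in range(2, depth):
        result &= tail_union(n)
    return result
```

Running `limsup_A()` yields `{0, 1}`: the transient element 2 drops out exactly because it fails the "infinitely many $n$" test, which is the mechanism the proof below exploits.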

Assume $X_n\to X$ almost surely in the sense of the first definition, and fix an arbitrary $\epsilon>0$. Define the sets $A_{n,\epsilon}:=\{\omega: |X_n(\omega)-X(\omega)|>\epsilon\}$. If $\omega\in\limsup A_{n,\epsilon}$, then $|X_n(\omega)-X(\omega)|>\epsilon$ for infinitely many values of $n$, and hence $X_n(\omega)$ does not converge to $X(\omega)$. So $\limsup A_{n,\epsilon}\subseteq \{\omega: X_n(\omega)\nrightarrow X(\omega)\}$, and by monotonicity of probability:

$\mathbb{P}(\lim\sup A_{n,\epsilon})\leq \mathbb{P}(\{\omega: X_n(\omega)\nrightarrow X(\omega)\})=0$

Second direction: Now assume $X_n\to X$ in the sense of the second definition. For each $k\in\mathbb{N}$ define $B_k=\limsup_n A_{n,\frac{1}{k}}$, where the sets $A_{n,\epsilon}$ are defined as before. By assumption $\mathbb{P}(B_k)=0$ for all $k$, and hence $\mathbb{P}(\bigcup_{k=1}^\infty B_k)=0$ by countable subadditivity. Now suppose $X_n(\omega)\nrightarrow X(\omega)$ for some $\omega$. Then there must be some $m\in\mathbb{N}$ such that $|X_n(\omega)-X(\omega)|>\frac{1}{m}$ for infinitely many natural numbers $n$, and thus $\omega\in B_m\subseteq\bigcup_{k=1}^\infty B_k$.

In other words, we have the inclusion $\{\omega: X_n(\omega)\nrightarrow X(\omega)\}\subseteq\cup_{k=1}^\infty B_k$, and so $\mathbb{P}(\{\omega: X_n(\omega)\nrightarrow X(\omega)\})=0$.

SECOND ANSWER

Intuition

There is not much intuition to be gleaned here. The second definition comes from "massaging" the definition of the [non-random] limit of real numbers (since for a fixed $\omega$, the limit $\lim_{n \to \infty} X_n(\omega)$ is just a non-random limit).

The utility of the second definition is that it is easier to verify, because it involves relatively simple sets $\{|X_n(\omega) - X(\omega)| > \epsilon\}$ (fixed $\epsilon$, fixed $n$). You only need to deal with one $n$ at a time to understand this set, and under certain circumstances bounding the probability of this set for each $n$ is enough to bound the probability of the $\limsup$ (e.g. via the Borel–Cantelli lemma, when $\sum_n P(A_{n,\epsilon}) < \infty$). By contrast, the set $\{\lim_{n \to \infty} X_n(\omega) = X(\omega)\}$ is difficult to deal with because of the limit inside the event.
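As a hypothetical numerical illustration (not part of the answer): take $\Omega = [0,1)$ with uniform $P$ and $X_n(\omega) = \omega^n$, which converges to $X = 0$ for every $\omega$. Then $A_{n,\epsilon} = \{\omega : \omega^n > \epsilon\} = (\epsilon^{1/n}, 1)$ shrinks as $n$ grows, so the tail unions $\bigcup_{k \ge n} A_{k,\epsilon} = A_{n,\epsilon}$ have probability $1 - \epsilon^{1/n} \to 0$, confirming $P(\limsup_n A_{n,\epsilon}) = 0$. The sketch below estimates these tail probabilities by Monte Carlo.

```python
import random

def tail_exceed_prob(n, eps, trials=100_000, seed=0):
    """Monte Carlo estimate of P(sup_{k>=n} |X_k| > eps) for X_k(w) = w**k,
    with w uniform on [0,1).  Since w**k is decreasing in k for w in [0,1),
    the sup over the tail equals w**n, so the event is just {w**n > eps},
    i.e. the tail union of the A_{k,eps} collapses to A_{n,eps}."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials) if rng.random() ** n > eps)
    return hits / trials
```

The exact value is $1 - \epsilon^{1/n}$, which tends to $0$ as $n \to \infty$; by continuity of probability from above, $P(\limsup_n A_{n,\epsilon}) = \lim_n P(\bigcup_{k\ge n} A_{k,\epsilon}) = 0$, matching the second definition.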


Notation

Let $A_{n, \epsilon} = \{|X_n(\omega) - X(\omega)| > \epsilon\}$. Note that $$\limsup_{n \to \infty} A_{n, \epsilon} := \bigcap_n \bigcup_{k \ge n} A_{k,\epsilon}$$ by definition.


(1) $\implies$ (2)

Fix $\epsilon > 0$. If $\omega \in \bigcap_n \bigcup_{k \ge n} A_{k, \epsilon}$, then $|X_n(\omega) - X(\omega)| > \epsilon$ for infinitely many $n$, so $\lim_n X_n(\omega) \ne X(\omega)$. Thus $$P\left(\limsup_n A_{n, \epsilon}\right) \le P\left(\left\{\omega : \lim_n X_n(\omega) \ne X(\omega)\right\}\right)$$ for each $\epsilon$. So if almost sure convergence holds in the sense of the first definition, the right-hand side is $0$, and it holds in the sense of the second definition.


(2) $\implies$ (1)

Conversely, suppose $\omega$ is such that $\lim_n X_n(\omega) \ne X(\omega)$. If you write out the definition of a limit, this means there exists some $\epsilon > 0$ (depending on $\omega$) such that $|X_n(\omega) - X(\omega)| > \epsilon$ for infinitely many $n$; shrinking $\epsilon$ if necessary, we may take $\epsilon = 1/k$ for some $k \in \mathbb{N}$. That is, $$\left\{\omega : \lim_n X_n(\omega) \ne X(\omega)\right\} \subseteq \bigcup_{k=1}^\infty \limsup_n A_{n, 1/k}.$$ Under the second definition each set in this countable union has probability $0$, so the left-hand side has probability $0$ as well. So if almost sure convergence holds in the sense of the second definition, it also holds in the sense of the first definition.