Convergence in probability with $\limsup$ and $\liminf$ constant implies a.s. convergence?


Let $X_1, X_2, \dots$ be random variables such that $\limsup_n X_n$ and $\liminf_n X_n$ are both constant a.s. Does it then follow that

$$ X_n \xrightarrow{\text{ Prob. }} X \implies X_n \xrightarrow{\text{ a.s. }} X \ ? $$

This question was inspired by the law of large numbers: if $\xi_1, \xi_2, \dots$ are i.i.d., then $\limsup_n \frac{\xi_1 + \cdots + \xi_n}{n}$ and $\liminf_n \frac{\xi_1 + \cdots + \xi_n}{n}$ are tail random variables and hence are constant a.s. by Kolmogorov's zero-one law.

Here is a simple counterexample (the "typewriter" sequence). Let $(\Omega, \mathcal{F}, \mathbb{P})$ be the probability space with $\Omega = [0,1]$, $\mathcal{F}$ the Borel $\sigma$-algebra of $[0,1]$, and $\mathbb{P}$ Lebesgue measure on $[0,1]$. For $k \in \mathbb{N}$ and $i = 1, 2, \dots, 2^k$ consider the dyadic interval $$ \Delta_k^i := \left[ \frac{i-1}{2^k}, \frac{i}{2^k} \right], $$ and let $X_k^i$ be the indicator function of $\Delta_k^i$. Enumerate these by setting $X_n := X_k^i$ for $n = 2^k + i$; as $i$ runs through $1, \dots, 2^k$ the index $n$ runs through $2^k + 1, \dots, 2^{k+1}$, so this defines a sequence of random variables $(X_n)$.

Every $x \in [0,1]$ lies in some $\Delta_k^i$ at each level $k$, so $X_n(x) = 1$ for infinitely many $n$; likewise, for each $k \ge 1$ the point $x$ lies outside at least one $\Delta_k^i$, so $X_n(x) = 0$ for infinitely many $n$. Hence $$ \tag{1} \limsup_n X_n = 1 \ \ \text{ and } \ \ \liminf_n X_n = 0 \ \ \text{ everywhere on } \ [0,1]. $$ On the other hand, for every $\varepsilon \in (0,1)$, $$ \mathbb{P}(|X_n| > \varepsilon) = \mathbb{P}(X_n = 1) = \mathbb{P}(X_k^i = 1) = 2^{-k} \to 0 \quad (n \to \infty), $$ so $X_n \to 0$ in probability. However, $(1)$ shows that $(X_n(x))$ converges at no point $x$, so in particular $X_n$ does not converge a.s.
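As a numerical sanity check (not part of the original answer), here is a Python sketch of the typewriter sequence under the enumeration $n = 2^k + i$; the helper names are my own. It confirms that along any fixed sample point $x$ the sequence $X_n(x)$ takes both values $0$ and $1$ infinitely often, while $\mathbb{P}(X_n = 1) = 2^{-k} \to 0$.

```python
from fractions import Fraction

def level_and_interval(n):
    # Invert the enumeration n = 2**k + i with 1 <= i <= 2**k (so n = 3, 4, 5, ...):
    # n lies in {2**k + 1, ..., 2**(k+1)}, hence k = bit_length(n - 1) - 1.
    k = (n - 1).bit_length() - 1
    i = n - 2**k
    return k, i

def X(n, x):
    # X_n = indicator of the dyadic interval Delta_k^i = [(i-1)/2**k, i/2**k].
    k, i = level_and_interval(n)
    return 1 if Fraction(i - 1, 2**k) <= x <= Fraction(i, 2**k) else 0

def prob_X_equals_1(n):
    # P(X_n = 1) = Lebesgue measure of Delta_k^i = 2**(-k).
    k, _ = level_and_interval(n)
    return Fraction(1, 2**k)

x = Fraction(1, 3)  # an arbitrary fixed sample point in [0, 1]
vals = [X(n, x) for n in range(3, 2**8 + 1)]

# X_n(x) hits 1 at least once per level k and 0 at least once per level k >= 1,
# so limsup X_n(x) = 1 and liminf X_n(x) = 0 along this sample path ...
assert 1 in vals and 0 in vals
# ... while P(X_n = 1) = 2**(-k) -> 0, i.e. X_n -> 0 in probability.
assert prob_X_equals_1(2**8) == Fraction(1, 128)
```

Exact rational arithmetic via `Fraction` avoids floating-point edge effects at the dyadic endpoints.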