The convergence modes for random variables are mostly analogous to those in real analysis:
- convergence in distribution $\leftrightarrow$ weak convergence
- convergence in probability $\leftrightarrow$ convergence in measure (this seems to have no universal name; I've also seen it called convergence in $L^0$)
- convergence a.s. $\leftrightarrow$ pointwise a.e. convergence
- convergence in $L^p$ $\leftrightarrow$ convergence in $L^p$ (assume $1 \leq p < \infty$)
In real analysis we emphasize pointwise convergence vs. uniform convergence, so I was wondering whether the concept of uniform convergence also carries over to probability theory. Convergence in $L^\infty$ seems to be the right notion, since the essential supremum excludes sets of measure zero:
$$\|f\|_\infty := \inf\{C \geq 0 : |f(x)| \leq C \text{ for a.e. } x\},$$
$$X_t \overset{L^\infty}\to X \quad\text{if}\quad \lim_{t\to\infty} \|X_t - X\|_\infty = 0.$$
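(To see that this is strictly stronger than the other modes, a standard example of my own choosing: on $([0,1], \mathcal{B}, \lambda)$ take $X_n = \mathbf{1}_{[0,1/n]}$. Then $X_n \to 0$ a.s. and $\|X_n\|_p = n^{-1/p} \to 0$ for every $p < \infty$, yet
$$\|X_n - 0\|_\infty = 1 \quad \text{for all } n,$$
since $[0,1/n]$ always has positive measure, so $X_n \not\to 0$ in $L^\infty$.)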
Is the concept of $L^\infty$ convergence used in probability theory?