When does almost sure pointwise convergence of a sequence of stochastic processes imply uniform almost sure convergence?


Suppose $\{X_n(t) \; : \; t \in K\}$, $n = 1, 2, \dots$, is a sequence of stochastic processes indexed by a compact subset $K$ of a metric space, and $\{X_\infty(t) \; : \; t\in K\}$ is a limiting process satisfying $X_n(t) \stackrel{a.s.}{\to} X_\infty(t)$ for every fixed $t \in K$. My question is: under what conditions is this enough to imply that $\sup_{t \in K} |X_n(t) - X_\infty(t)| \stackrel{a.s.}{\to} 0$? I assume this must depend on the continuity (or some equicontinuity) of the processes, and perhaps on the complexity of $K$, but I cannot find a suitable general result of this type.
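To see why pointwise convergence alone cannot suffice, here is a minimal numerical sketch of the classic "moving tent" counterexample on $K = [0,1]$ (deterministic, so almost sure convergence holds trivially): $X_n(t) = \max(0, 1 - n|t - 1/n|)$ converges to $0$ at every fixed $t$, yet $\sup_{t \in K} |X_n(t)| = 1$ for all $n$. The function name `X` and the grid are my own illustration, not part of the question.

```python
import numpy as np

def X(n, t):
    # Tent of height 1 centered at t = 1/n; shrinks toward t = 0 as n grows.
    return np.maximum(0.0, 1.0 - n * np.abs(t - 1.0 / n))

t = np.linspace(0.0, 1.0, 100001)  # fine grid on K = [0, 1]

for n in [10, 100, 1000]:
    # At any fixed t > 0 the tent eventually moves past t, so X_n(t) -> 0,
    # but the supremum over K stays equal to 1 for every n.
    print(n, X(n, 0.5), X(n, t).max())
```

Any sufficient condition must therefore rule out this kind of "escaping bump", e.g. via equicontinuity of the sample paths (an Arzelà–Ascoli-type argument) or monotonicity in $n$ (a Dini-type argument).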