I'm trying to understand an argument about the convergence of a sequence of functions from a research paper (the functions are random, but the following argument applies to each sample path separately).
Consider a sequence of functions $f^{r}(t):[0,\infty)\rightarrow[0,\infty)$ such that for each $r$, $f^{r}(t)$ is nondecreasing in $t$, $$ f^{r}(s)-f^{r}(t)\leq(s-t),\quad t\leq s, $$ and $f^{r}(t)\leq t$. It is then stated that every sequence has a subsequence, say $\{r_{j}\}$, such that $$ f^{r_{j}}(t)\rightarrow f(t),\quad \mbox{ u.o.c. as } \quad j\rightarrow\infty, $$ for some function $f(t)$, where u.o.c. stands for uniformly on compact sets.
Questions: (1) Does the above follow from the Arzelà-Ascoli theorem together with the fact that each $f^{r}$ is Lipschitz (with constant $1$) and uniformly bounded on any compact set $[0,T]$?
(2) Given that every sequence has a convergent subsequence, can we then conclude that $f^{r}(t)\rightarrow f(t)$ u.o.c. as $r\rightarrow\infty$?
Thanks for elaborating!
You're set up to use Arzelà-Ascoli. You have uniform boundedness on each $[0,T]$ (since $0\leq f^{r}(t)\leq t\leq T$) and equicontinuity (the displayed inequality together with monotonicity gives $|f^{r}(s)-f^{r}(t)|\leq|s-t|$, i.e. the family is uniformly $1$-Lipschitz). So Arzelà-Ascoli gives a subsequence that converges uniformly on $[0,1]$, then a subsequence of that which converges uniformly on $[0,2]$, etc. Then diagonalize to obtain a single subsequence that converges uniformly on every compact set.
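As a numerical sanity check, here is a hypothetical concrete family satisfying all the hypotheses, $f^{r}(t) = \frac{r}{r+1}\,t$ (my own illustrative choice, not from the paper): it is nondecreasing, $1$-Lipschitz, and satisfies $f^{r}(t)\leq t$, and the whole sequence converges u.o.c. to $f(t)=t$ with sup-norm error exactly $T/(r+1)$ on $[0,T]$.

```python
import numpy as np

def f(r, t):
    # Illustrative family: f^r(t) = (r/(r+1)) * t.
    # Nondecreasing, 1-Lipschitz, and f^r(t) <= t for all t >= 0.
    return r / (r + 1) * t

T = 5.0
t = np.linspace(0.0, T, 10_001)  # grid on the compact set [0, T]

# Sup-norm distance to the candidate limit f(t) = t, for r = 1, ..., 100.
sup_err = [np.max(np.abs(f(r, t) - t)) for r in range(1, 101)]

# The error on [0, T] is T/(r+1), which decreases to 0: uniform
# convergence on this compact set.
assert abs(sup_err[0] - T / 2) < 1e-12
assert sup_err[-1] < 0.05
```

Of course this only illustrates the conclusion for one family; the point of Arzelà-Ascoli is that boundedness plus equicontinuity force a convergent subsequence for *any* family satisfying the hypotheses.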
The answer to (2) is no. Example: the alternating sequence $0, t, 0, t, \dots$ satisfies all the hypotheses, and every subsequence of it has a further convergent subsequence, but the full sequence has two distinct subsequential limits ($0$ and $t$), so it does not converge.
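A quick numerical check of this counterexample (the function definitions below are just a direct transcription of the alternating family): consecutive terms stay at sup-distance $1$ apart on $[0,1]$, so the full sequence is not uniformly Cauchy, while the even-indexed subsequence is constant and trivially converges.

```python
import numpy as np

def f(r, t):
    # Alternating counterexample: f^r = 0 for odd r, f^r(t) = t for even r.
    return t if r % 2 == 0 else np.zeros_like(t)

t = np.linspace(0.0, 1.0, 1001)  # grid on the compact set [0, 1]

# Sup-distance between consecutive terms on [0, 1] is always 1,
# so the full sequence cannot converge uniformly (not Cauchy).
gaps = [np.max(np.abs(f(r + 1, t) - f(r, t))) for r in range(1, 20)]
assert all(g == 1.0 for g in gaps)

# Yet the even-indexed subsequence is constant, hence convergent.
even_gap = np.max(np.abs(f(4, t) - f(2, t)))
assert even_gap == 0.0
```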