"Let $(X_n)_{n\ge 1}$ be a sequence of uniformly bounded random variables defined on a probability space $(\Omega, \mathscr{F}, P)$. Moreover define $\mathscr{F_0}=\{\emptyset,\Omega\}$ and $\mathscr{F}_n=\sigma(X_1,\ldots,X_n)$ for each $n\ge 1$. Then with probability $1$ it holds $$ \limsup_{n\to \infty} \frac{1}{n}\sum_{m=1}^n X_m=\limsup_{n\to \infty} \frac{1}{n}\sum_{m=1}^n \mathbf{E}[X_m|\mathscr{F}_{m-1}]." $$
I couldn't find a proof of this fact, which appears in an old article. How can we prove it?
This is a consequence of martingale convergence. Setting $Y_n=X_n-\mathbf{E}[X_n\mid\mathscr{F}_{n-1}]$, we need to show that $$ \frac1n\sum_{m=1}^nY_m=\frac1n\sum_{m=1}^nX_m-\frac1n\sum_{m=1}^n\mathbf{E}[X_m\mid\mathscr{F}_{m-1}]\to0 $$ with probability one.

The uniform boundedness hypothesis says that there is an $A > 0$ such that $\lvert X_n\rvert\le A$ for all $n$. In particular, $\mathbf{E}[X_n^2]\le A^2$ (which is all we really need), and since conditional expectation is an orthogonal projection in $L^2$, this implies that $\mathbf{E}[Y_n^2]\le\mathbf{E}[X_n^2]\le A^2$. Also, $\mathbf{E}[Y_n\mid\mathscr{F}_{n-1}]=0$, so the process $M_n=\sum_{m=1}^nY_m/m$ is a martingale; that is, $\mathbf{E}[M_n\mid\mathscr{F}_{n-1}]=M_{n-1}$. It is also $L^2$-bounded: since martingale increments are orthogonal in $L^2$, $$ \mathbf{E}[M_n^2]=\sum_{m=1}^n\mathbf{E}[Y_m^2/m^2]\le\sum_{m=1}^nA^2/m^2\le\frac{A^2\pi^2}6. $$

Now Doob's martingale convergence theorem says that $M_n=\sum_{m=1}^nY_m/m$ converges to a finite limit with probability one. Kronecker's lemma then gives $$ \frac1n\sum_{m=1}^nY_m=\frac1n\sum_{m=1}^nm\,(Y_m/m)\to0 $$ with probability one.
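Not part of the proof, but here is a quick numerical sanity check of the conclusion. It uses a hypothetical example of my own choosing: a two-state Markov chain on $\{-1,+1\}$ (so the $X_n$ are dependent but bounded by $A=1$), where $\mathbf{E}[X_n\mid\mathscr{F}_{n-1}]$ depends only on $X_{n-1}$ and is explicitly computable, so both running averages can be compared directly:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Markov chain on {-1, +1}:
#   P(X_n = +1 | X_{n-1} = +1) = 0.3,  P(X_n = +1 | X_{n-1} = -1) = 0.7.
# Then E[X_n | F_{n-1}] = 2p - 1 where p is the relevant transition probability.
X = np.empty(N)
cond_exp = np.empty(N)
x_prev = 1  # arbitrary initial state X_0 = +1
for n in range(N):
    p = 0.3 if x_prev == 1 else 0.7
    cond_exp[n] = 2 * p - 1            # E[X_n | F_{n-1}], known in closed form here
    x_prev = 1 if rng.random() < p else -1
    X[n] = x_prev

# Running averages (1/n) * sum_{m<=n} X_m and (1/n) * sum_{m<=n} E[X_m | F_{m-1}]
avg_X = np.cumsum(X) / np.arange(1, N + 1)
avg_E = np.cumsum(cond_exp) / np.arange(1, N + 1)

# The gap (1/n) * sum Y_m should be small for large n
print(abs(avg_X[-1] - avg_E[-1]))
```

The printed gap is on the order of $n^{-1/2}$, consistent with $\frac1n\sum_{m\le n}Y_m\to0$ almost surely; here both $\limsup$s are in fact limits and equal the stationary mean $0$.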