Consider a constant $k$ and a martingale $\{ X_n, n \ge 1 \}$ s.t. $EX_1^2 \le k$ and $\forall n \ge 2$, $E(X_n - X_{n - 1})^2 \le k$.
Let $\lambda$ denote $EX_1$ (by the martingale property, $EX_n = EX_1 = \lambda$ for every $n$). By Chebyshev's inequality, we can prove that $$\forall \epsilon > 0, \lim_{n \to \infty} P\left( \left| \frac{X_n - \lambda}{n} \right| > \epsilon \right) = 0.$$
However, do we have $$\forall \epsilon > 0, \lim_{n \to \infty} P \left( \left| \frac{X_n}{n} \right| > \epsilon \right) = 0?$$
It seems intuitive to me at first sight, but I don't know how to prove or disprove it.
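As a quick sanity check (not a proof), here is a Monte Carlo sketch for the simple symmetric random walk, a martingale satisfying the hypotheses with $k = 1$; the helper name `tail_prob` and the chosen parameters are mine:

```python
# Monte Carlo sanity check that P(|X_n / n| > eps) shrinks as n grows,
# for the +/-1 symmetric random walk: a martingale with
# E[X_1^2] = E[(X_n - X_{n-1})^2] = 1, i.e. the hypotheses hold with k = 1.
import numpy as np

def tail_prob(n, eps, trials=20_000, seed=0):
    """Estimate P(|X_n / n| > eps) over `trials` independent walks."""
    rng = np.random.default_rng(seed)
    steps = rng.choice((-1, 1), size=(trials, n))  # i.i.d. +/-1 increments
    x_n = steps.sum(axis=1)                        # X_n for each walk
    return float(np.mean(np.abs(x_n / n) > eps))

if __name__ == "__main__":
    for n in (10, 100, 1000):
        print(n, tail_prob(n, eps=0.1))
```

The estimated tail probability drops sharply as $n$ grows, which at least makes the conjecture plausible.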
The detailed proof of the first formula is as follows:
For any $1 \le n \le m$, we have $$E(X_n X_m) = E(E(X_n X_m | X_1, \cdots, X_n)) = E(X_n E(X_m | X_1, \cdots, X_n)) = E(X_n^2).$$
Thus, for any $2 \le n < m$, $$E((X_n - X_{n - 1})(X_m - X_{m - 1})) = E(X_n^2) - E(X_n^2) - E(X_{n - 1}^2) + E(X_{n - 1}^2) = 0.$$
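This orthogonality of martingale increments can be checked numerically. A sketch on a martingale whose increments are genuinely dependent, $X_j = S_j^2 - j$ for a $\pm 1$ random walk $S_j$ (the helper `increment_covariance` is my own name):

```python
# Numerically check E[(X_n - X_{n-1})(X_m - X_{m-1})] = 0 for n < m on a
# martingale with dependent increments: X_j = S_j^2 - j, where S_j is a
# +/-1 symmetric random walk (here X_j - X_{j-1} = 2 S_{j-1} xi_j).
import numpy as np

def increment_covariance(n, m, trials=200_000, seed=0):
    """Monte Carlo estimate of E[(X_n - X_{n-1})(X_m - X_{m-1})] for n < m."""
    rng = np.random.default_rng(seed)
    steps = rng.choice((-1, 1), size=(trials, m))
    s = np.cumsum(steps, axis=1)                # S_1, ..., S_m
    x = s**2 - np.arange(1, m + 1)              # X_1, ..., X_m
    x = np.hstack([np.zeros((trials, 1)), x])   # prepend X_0 = S_0^2 - 0 = 0
    d_n = x[:, n] - x[:, n - 1]
    d_m = x[:, m] - x[:, m - 1]
    return float(np.mean(d_n * d_m))

if __name__ == "__main__":
    print(increment_covariance(3, 7))  # close to 0
```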
Therefore, writing $X_n = X_1 + \sum_{i = 2}^n (X_i - X_{i - 1})$, where every cross term has zero expectation (in particular $E(X_1(X_i - X_{i - 1})) = E(X_1^2) - E(X_1^2) = 0$ by the first identity), $$\mathrm{var}\, X_n = E[X_n^2] - E^2[X_n] = E[X_1^2] + \sum_{i = 2}^n E(X_i - X_{i - 1})^2 - \lambda^2 \le nk - \lambda^2.$$
By Chebyshev's inequality, we have $$P(|X_n - \lambda| > n \epsilon) \le \frac{nk - \lambda^2}{n^2 \epsilon^2}.$$
Thus, $$0 \le \lim_{n \to \infty} P\left( \left| \frac{X_n - \lambda}{n} \right| > \epsilon \right) \le \lim_{n \to \infty} \frac{nk - \lambda^2}{n^2 \epsilon^2} = 0$$
Use Markov's inequality: $$P(|X_n/n|>\varepsilon)=P(|X_n/n|^2>\varepsilon^2)\leq \varepsilon^{-2}\frac{E[X_n^2]}{n^2},$$ and (setting $X_0=0$, so that $X_1 - X_0 = X_1$ and $E[(X_1-X_0)^2] = E[X_1^2] \le k$; note the summation indices are $i, j$, to avoid a clash with the constant $k$) $$E[X_n^2]=\sum_{1\leq i\leq n}E[(X_i-X_{i-1})^2]+\sum_{1\leq i\neq j\leq n}E[(X_i-X_{i-1})(X_j-X_{j-1})]\leq nk,$$ where the cross terms vanish by the orthogonality of martingale increments. The right-hand side of the first display is then at most $k/(n\varepsilon^2) \to 0$.