What is an example of a sequence of random variables $\{X_n\}$ on a probability space $(\Omega, \mathscr{F}, P)$ such that $E\left(X_n^2\right) \to 0$ but it is not the case that $$ \frac{S_n - E(S_n)}{n} \to 0 $$ almost everywhere, where $S_n = X_1 + \cdots + X_n$? There is no assumption of independence of $\{X_n\}$.
See Chung's A Course in Probability Theory, problem 5.1.1, for reference.
Attempts
Take $P$ to be Lebesgue measure on the Borel sets of $(0,1]$. Let $X_1 = 1$ on $(0,1]$, $X_2 = 1$ on $(0,1/2]$, $X_3 = 1$ on $(1/2,1]$, $X_4 = 1$ on $(0,1/3]$, and so on. This is the standard example showing that convergence in probability does not imply convergence almost everywhere. It fails here: for each fixed $\omega$, exactly one indicator per level equals $1$, so $S_n(\omega)$ grows like $\sqrt{2n}$, and $E(S_n)$ grows at the same rate, so $(S_n - E(S_n))/n \to 0$ everywhere.
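This failure can be checked numerically. The sketch below assumes the standard "typewriter" indexing, where term $n = k(k-1)/2 + j$ (for $1 \leqslant j \leqslant k$) is the indicator of $((j-1)/k, j/k]$; for a fixed $\omega$ the partial sums grow only like $\sqrt{2n}$, so $S_n/n \to 0$ pointwise even though $X_n(\omega)$ keeps returning to $1$.

```python
# Sketch of the "typewriter" sequence; the indexing n = k(k-1)/2 + j is an assumption.

def typewriter(n, omega):
    """n-th typewriter indicator 1_{((j-1)/k, j/k]} evaluated at omega in (0, 1]."""
    k = 1
    while k * (k + 1) // 2 < n:
        k += 1                          # find the level k containing index n
    j = n - k * (k - 1) // 2            # slot j within level k, 1 <= j <= k
    return 1 if (j - 1) / k < omega <= j / k else 0

omega = 0.3
N = 5050                                # levels 1..100 end at index 100*101/2 = 5050
S = sum(typewriter(n, omega) for n in range(1, N + 1))
# Exactly one indicator per level equals 1 at omega, so S_n grows like sqrt(2n):
print(S, S / N)                         # 100 0.0198...
```

Here $E(X_n^2) = 1/k \to 0$ as well, so the sequence satisfies the hypothesis but not the desired conclusion.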
Take $X_n = 1$ on $(0,1/n]$. This also fails: $X_n \to 0$ almost everywhere, and for each $\omega > 0$ the sum $S_n(\omega)$ is eventually constant, while $E(S_n) = \sum_{k \le n} 1/k \sim \log n$, so $(S_n - E(S_n))/n \to 0$ almost everywhere.
Some observations. Suppose that $\mathbb E\left[X_n^2\right]\to 0$. By the Cauchy–Schwarz inequality, $\mathbb E|X_n| \leqslant \sqrt{\mathbb E\left[X_n^2\right]} \to 0$, so by Cesàro means $\mathbb E(S_n)/n \to 0$. Hence it suffices to construct such a sequence with $S_n/n \not\to 0$ almost everywhere.
Consider your favorite sequence $\left(Y_i\right)_{i\geqslant 1}$ of non-negative random variables which converges to zero in $\mathbb L^2$ but not almost everywhere (for instance, the typewriter sequence from the first attempt). For $2^N\leqslant n\leqslant 2^{N+1}-1$, define $X_n=Y_N$. Then $\mathbb E\left[X_n^2\right] = \mathbb E\left[Y_N^2\right] \to 0$, while by non-negativity $$\frac 1{2^{N+1}}S_{2^{N+1}} \geqslant\frac 1{2^{N+1}}\sum_{i=2^N}^{2^{N+1}-1}X_i=\frac 12 Y_N. $$ Since $Y_N \not\to 0$ almost everywhere, $S_n/n$ does not converge to zero almost everywhere along $n = 2^{N+1}$; combined with $\mathbb E(S_n)/n \to 0$, the sequence $(S_n - E(S_n))/n$ fails to converge to zero almost everywhere.
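To see the construction in action, here is a numerical sketch that takes $Y_i$ to be the typewriter indicators (an assumed choice; any non-negative sequence that is $\mathbb L^2$-null but not a.e.-null works, and the index is shifted so $Y_N$ is defined for $N \geqslant 0$). At each block boundary $n = 2^{N+1}$ with $Y_N(\omega) = 1$, the ratio $S_n/n$ is at least $1/2$, and such $N$ occur for every $\omega$ at every typewriter level.

```python
# Sketch: blocked construction X_n = Y_N for 2^N <= n <= 2^(N+1)-1,
# with Y_N taken to be the (N+1)-th typewriter indicator (an assumption).

def typewriter(i, omega):
    """i-th typewriter indicator 1_{((j-1)/k, j/k]} evaluated at omega in (0, 1]."""
    k = 1
    while k * (k + 1) // 2 < i:
        k += 1
    j = i - k * (k - 1) // 2           # slot j within level k
    return 1 if (j - 1) / k < omega <= j / k else 0

def Y(N, omega):
    return typewriter(N + 1, omega)    # shift so Y_N is defined for N >= 0

omega = 0.3
S, ratios = 0, {}
for n in range(1, 2 ** 13 + 1):
    N = n.bit_length() - 1             # X_n = Y_N for 2^N <= n <= 2^(N+1) - 1
    S += Y(N, omega)
    if n == 2 ** N:                    # block boundary: record S_{2^N} / 2^N
        ratios[N] = S / n

hits = [N for N in range(12) if Y(N, omega) == 1]
for N in hits:
    assert ratios[N + 1] >= 0.5        # S_{2^(N+1)} / 2^(N+1) >= Y_N / 2
print(hits)                            # [0, 1, 3, 7, 11]
```

For every $\omega$, one such $N$ occurs per typewriter level, so $\limsup_n S_n/n \geqslant 1/2$ pointwise, which is what the displayed inequality promises.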