Let $X_1, X_2, \ldots$ be independent, but not necessarily identically distributed, random variables with $\text{Var}(X_n) < \infty$ for all $n$ and $\text{Var}(S_n) = o(n^2)$ as $n \to \infty$, where $S_n = X_1 + \ldots + X_n$. Prove that $\frac{S_n - E(S_n)}{n} \to 0$ in probability as $n \to \infty$.
Finite variance gives us a finite second moment, so Chebyshev's inequality applies:
$$\forall \epsilon > 0 \quad P \left( \left| \frac{S_n}{n} - E \left( \frac{S_n}{n} \right) \right| \geq \epsilon \right) \leq \frac{\text{Var}\left( \frac{S_n}{n} \right)}{\epsilon^2} = \frac{\text{Var} (S_n)}{n^2 \epsilon^2}.$$ Then I used $\text{Var}(S_n) = o(n^2)$ to bound $\text{Var}(S_n) \leq \epsilon n^2$ for large $n$, which yields $\frac{\epsilon n^2}{n^2 \epsilon^2} = \frac{1}{\epsilon}$. But this is constant in $n$ (and blows up as $\epsilon \to 0$), whereas I need a bound that tends to $0$ as $n \to \infty$.
I also tried Cantelli's inequality (similar in form to Chebyshev's, but with upper bound $\frac{2 \text{Var} (S_n)}{\epsilon^2 + \text{Var}(S_n)}$), which only gave me the bound $2$, just as useless.
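Not part of a proof, but to convince myself the statement is at least true I ran a quick simulation. The specific choice $X_k \sim N(0, k^{1/4})$ is purely my own illustrative assumption: it gives $\text{Var}(X_k) = \sqrt{k}$, hence $\text{Var}(S_n) = \sum_{k \le n} \sqrt{k} \sim \frac{2}{3} n^{3/2} = o(n^2)$, satisfying the hypothesis.

```python
import numpy as np

# Numerical sanity check (not a proof). Illustrative assumption:
# X_k ~ N(0, k^(1/4)), so Var(X_k) = sqrt(k) and
# Var(S_n) = sum_{k<=n} sqrt(k) ~ (2/3) n^(3/2) = o(n^2).
rng = np.random.default_rng(0)

n = 10_000
trials = 500
eps = 0.3

# Each row is one sample path X_1, ..., X_n; here E(S_n) = 0 by symmetry.
X = rng.normal(0.0, np.arange(1, n + 1) ** 0.25, size=(trials, n))
Sn = X.sum(axis=1)

# Empirical P(|S_n/n - E(S_n/n)| >= eps). Std(S_n/n) ~ 0.82 * n^(-1/4) ~ 0.08
# for this n, so with eps = 0.3 the frequency should be essentially zero.
freq = np.mean(np.abs(Sn / n) >= eps)
print(freq)
```

The empirical exceedance frequency is (essentially) zero for this $n$, consistent with convergence in probability.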
Would be very grateful for any help.