Let $\{X_n\}$ be a sequence of uncorrelated random variables with common mean $\mu$, such that $\sup_n\mathrm{Var}(X_n)<\infty$. If $S_n=\sum_{k=1}^n X_k$, show that $n^{-2}\sum_{k=1}^nS_k$ converges in probability as $n\to\infty$ and identify the limit.
I know that, using the Weak Law of Large Numbers, it is easy to prove that $\frac{S_n}{n}\to\mu$ in $L^2$, and hence in probability as well. But I'm having trouble using this result to prove that $n^{-2}\sum_{k=1}^nS_k$ converges in probability. I think it would help to know the limit (at least intuitively) before trying to prove this, but I'm a bit stuck on this problem.
My attempt: First note that $$\frac{\sum_{k=1}^nS_k}{n^2}\leq\frac{nS_n}{n^2}=\frac{S_n}{n}$$ Then since we know $E(\lvert \frac{S_n}{n}-\mu\rvert^2)\to 0$, $E(\lvert\frac{\sum_{k=1}^nS_k}{n^2}-\mu \rvert^2)\leq E(\lvert \frac{S_n}{n}-\mu\rvert^2)$ by monotonicity (since expectations are integrals, and monotonicity holds for integrals), and thus $E(\lvert\frac{\sum_{k=1}^nS_k}{n^2}-\mu \rvert^2)\to 0$ as $n\to\infty$. Now using Markov's inequality, I can conclude that $\frac{\sum_{k=1}^nS_k}{n^2}\to\mu$ in probability.
I kind of get the feeling that my answer may be incorrect. I would love for someone to provide some hints/point me in the right direction. Thanks
It follows from Chebyshev's inequality that if $\{Y_n\}$ is a sequence of random variables with finite second moments such that $\mathbb{E}[Y_n]\to\mu$ and $\mathrm{var}(Y_n)\to0$, then $Y_n\to\mu$ in probability.
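To spell out this step (a sketch, since $\mathbb{E}[Y_n]$ need only converge to $\mu$ rather than equal it): for fixed $\epsilon>0$ take $n$ large enough that $|\mathbb{E}[Y_n]-\mu|<\epsilon/2$; then $|Y_n-\mu|>\epsilon$ forces $|Y_n-\mathbb{E}[Y_n]|>\epsilon/2$, so Chebyshev's inequality gives
$$\mathbb{P}\big(|Y_n-\mu|>\epsilon\big)\leq\mathbb{P}\big(|Y_n-\mathbb{E}[Y_n]|>\epsilon/2\big)\leq\frac{4\,\mathrm{var}(Y_n)}{\epsilon^2}\to0.$$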
Now note that each $X_j$ appears in $S_k$ exactly when $k\geq j$, so interchanging the order of summation gives $$ \sum_{k=1}^nS_k=\sum_{k=1}^n\sum_{j=1}^kX_j=\sum_{k=1}^n(n-k+1)X_k$$ Therefore $$ \mathbb{E}\Big[n^{-2}\sum_{k=1}^nS_k\Big]=n^{-2}\sum_{k=1}^n(n-k+1)\mu=\frac{n(n+1)}{2n^2}\mu\to \frac{\mu}{2}$$ as $n\to\infty$. Moreover, since the $X_k$ are uncorrelated, writing $C=\sup_k\mathrm{var}(X_k)<\infty$ we get $$ \mathrm{var}\Big(n^{-2}\sum_{k=1}^nS_k\Big)=n^{-4}\sum_{k=1}^n(n-k+1)^2\mathrm{var}(X_k)\leq\frac{C}{n^4}\sum_{j=1}^nj^2\leq\frac{C}{n}\to0$$ as $n\to\infty$, so the result follows by applying the opening observation with $Y_n=n^{-2}\sum_{k=1}^nS_k$: we conclude $n^{-2}\sum_{k=1}^nS_k\to\mu/2$ in probability.
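As a numerical sanity check of the limit $\mu/2$ (not part of the proof), here is a short Monte Carlo sketch in Python. It assumes i.i.d. $\mathrm{Uniform}(\mu-1,\mu+1)$ draws with $\mu=1$, a special case of the hypotheses; the function name `averaged_partial_sums` is just an illustrative choice.

```python
import random

random.seed(0)

def averaged_partial_sums(n, mu=1.0):
    """One sample of n^{-2} * sum_{k=1}^n S_k with i.i.d. Uniform(mu-1, mu+1) X_k."""
    s = 0.0      # running partial sum S_k
    total = 0.0  # running sum S_1 + ... + S_k
    for _ in range(n):
        s += random.uniform(mu - 1, mu + 1)
        total += s
    return total / n ** 2

# Average a few independent runs; the result should be close to mu/2 = 0.5,
# since var(n^{-2} sum S_k) <= (1/3)/n is already tiny for n = 2000.
estimate = sum(averaged_partial_sums(2000) for _ in range(100)) / 100
print(estimate)
```

For $\mu=1$ the exact mean is $\frac{n(n+1)}{2n^2}\approx 0.50025$ at $n=2000$, so the printed value should sit very close to $0.5$.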