I am not sure my attempt is correct. I have seen proofs of the convergence of the sample variance in probability, and I wonder whether showing $L^2$ convergence (convergence in quadratic mean) might be an easier way of establishing convergence in probability.
Let $\{X_i\}$ be iid with $\mu = \mathbb{E}(X_1)$ and $\sigma^2 = \mathbb{V}(X_1)$, and let
$$S_{n}^2 = \frac{1}{n-1} \sum_{i=1}^n (X_i - \bar{X}_n)^2.$$
My attempt:
$\mathbb{E}[(S_{n}^2 - \sigma^2 )^2] = \mathbb{E} [(S_{n}^2)^2 - 2S_{n}^2\sigma^2 + \sigma^4 ] \\= \mathbb{E}[(S_{n}^2)^2] -2\sigma^2\mathbb{E}[S_{n}^2]+ \sigma^4 \\= \mathbb{V}(S_{n}^2) + \mathbb{E}[S_{n}^2]^2-\sigma^4 \\= \mathbb{V}(S_{n}^2),$

since $S_n^2$ is unbiased, $\mathbb{E}[S_n^2] = \sigma^2$. Assuming the fourth moment $\mu_4 = \mathbb{E}[(X_1 - \mu)^4]$ is finite, a standard computation gives $\mathbb{V}(S_n^2) = \frac{1}{n}\left(\mu_4 - \frac{n-3}{n-1}\sigma^4\right)$.
This tends to zero as $n \to \infty$, which gives convergence in $L^2$; and since $L^2$ convergence implies convergence in probability (by Markov's inequality applied to $(S_n^2 - \sigma^2)^2$), we are done.
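As a quick sanity check of the argument (my own illustration, not part of the question), one can estimate the $L^2$ distance $\mathbb{E}[(S_n^2 - \sigma^2)^2]$ by Monte Carlo and watch it shrink with $n$. Here I take $X_i \sim N(0,1)$, so $\sigma^2 = 1$ and the formula above predicts $\mathbb{V}(S_n^2) = 2/(n-1)$:

```python
import numpy as np

# Monte Carlo estimate of E[(S_n^2 - sigma^2)^2] for growing n,
# with X_i ~ N(0, 1), so sigma^2 = 1 and mu_4 = 3.
rng = np.random.default_rng(0)
sigma2 = 1.0
reps = 2000  # number of independent samples of size n

mses = []
for n in [10, 100, 1000]:
    x = rng.standard_normal((reps, n))
    s2 = x.var(axis=1, ddof=1)              # unbiased sample variance S_n^2
    mses.append(np.mean((s2 - sigma2) ** 2))  # empirical L^2 distance

print(mses)  # should shrink roughly like 2 / (n - 1)
```

The estimates should track $2/(n-1)$, i.e. roughly $0.22$, $0.02$, and $0.002$ for the three sample sizes.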
My only concern is that I may be misunderstanding something, because I couldn't find a similar statement anywhere, and my textbook (All of Statistics, Wasserman) gives a hint that isn't similar.