Let $X_1, X_2, \ldots, X_n, \ldots$ be independent random variables. Assume that for each $n$, the random variable $X_n$ is distributed uniformly on $[0,n]$. Find a sequence $a_n$ such that $(X_1^2 + \cdots + X_n^2) / a_n$ converges to $1$ in probability.
I honestly have no idea how to start this problem. The only things that come to mind are to make a Law of Large Numbers type argument or to try to brute force it by using the definition of convergence in probability.
Since convergence in probability implies convergence in distribution, Fatou's lemma suggests a candidate for $a_n$: if $\{Y_n\}$ are non-negative random variables converging in distribution to $Y$, then $$\mathbb E(Y)\leqslant \liminf_{n\to \infty}\mathbb E(Y_n).$$ Here $\mathbb E(X_j^2)=\int_0^j \frac{x^2}{j}\,dx=\frac{j^2}3$, hence we should have $$1\leqslant \liminf_{n\to \infty}\frac 1{a_n}\sum_{j=1}^n\frac{j^2}3.$$
If we take $a_n:=\sum_{j=1}^n\frac{j^2}3$, then by Chebyshev's inequality it suffices to show that $$\frac 1{a_n^2}\mathbb E\left(\sum_{j=1}^n\{X_j^2-\mathbb E(X_j^2)\}\right)^2\to 0.$$ By independence, this second moment is $\frac 1{a_n^2}\sum_{j=1}^n\operatorname{Var}(X_j^2)$, and $$\operatorname{Var}(X_j^2)=\mathbb E(X_j^4)-\mathbb E(X_j^2)^2=\frac{j^4}5-\frac{j^4}9=\frac{4j^4}{45},$$ so the numerator grows like $n^5$ while $a_n^2$ grows like $n^6/81$, and the ratio tends to $0$.
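As a quick numerical sanity check (not part of the proof), one can simulate $S_n=X_1^2+\cdots+X_n^2$ for a large $n$ and compare it with $a_n=\sum_{j=1}^n j^2/3$; the function name below is my own.

```python
import random

def ratio(n, seed=0):
    """Simulate S_n / a_n, where S_n = sum of X_j^2 with X_j ~ Uniform[0, j],
    and a_n = sum_{j=1}^n j^2 / 3. Should be close to 1 for large n."""
    rng = random.Random(seed)
    s_n = sum(rng.uniform(0, j) ** 2 for j in range(1, n + 1))
    a_n = sum(j * j for j in range(1, n + 1)) / 3
    return s_n / a_n

print(ratio(100_000))
```

Since the variance computation above shows $\operatorname{Var}(S_n/a_n)$ is of order $1/n$, the printed ratio should deviate from $1$ by only a few tenths of a percent at $n=10^5$.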