If $(a_n)$ is a sequence in $(0,1)$, show that $\frac{1}{n} \sum_{k=1}^{n} a_k \rightarrow 0$ if and only if $\frac{1}{n} \sum_{k=1}^{n} a_k^2 \rightarrow 0$.
Since each $a_k \in (0,1)$, we have $a_k^2 < a_k$, so the terms of the latter sum are smaller than those of the former but still positive. By the sandwich theorem, I have established that the former's convergence implies the latter's, as the display below makes explicit.
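Spelling out that step: $0 < a_k < 1$ gives $0 < a_k^2 < a_k$, so summing and dividing by $n$ yields
$$0 \leq \frac 1n \sum_{k=1}^n a_k^2 \leq \frac 1n \sum_{k=1}^n a_k \to 0.$$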
How do I prove the other direction?
When $\frac 1n \sum_{k=1}^n a_k^2\to 0$, apply the Cauchy–Schwarz inequality to get $$0\leq \frac 1n \sum_{k=1}^n a_k \leq \frac 1n \sqrt{\sum_{k=1}^n a_k^2}\,\sqrt{\sum_{k=1}^n 1} = \frac{\sqrt{n}}{n}\sqrt{\sum_{k=1}^n a_k^2} = \sqrt{\frac 1n\sum_{k=1}^n a_k^2}$$
and conclude by squeezing, since the right-hand side tends to $0$ by hypothesis.
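As a quick sanity check (an illustrative example, not part of the problem), take $a_k = 1/\sqrt{k}$. Comparing each sum with the corresponding integral gives
$$\frac 1n \sum_{k=1}^n \frac{1}{\sqrt{k}} \leq \frac{2\sqrt{n}}{n} = \frac{2}{\sqrt{n}} \to 0, \qquad \frac 1n \sum_{k=1}^n \frac{1}{k} \leq \frac{1+\ln n}{n} \to 0,$$
so both averages vanish together, as the equivalence predicts.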