Suppose $a_i \sim \mathcal{N}(0,1)$ are iid. Then for any $\epsilon>0$ there is a constant $C >0$ such that $m \ge C\cdot n$ implies that \begin{align} X_1 = \frac{1}{m}\sum_{i=1}^{m}\left(a^{2}_i - 1\right) &\le \epsilon \\ X_2 = \frac{1}{m}\sum_{i=1}^{m}\left(a^{4}_i - 2\right) &\le \epsilon \\ X_3 = \frac{1}{m}\sum_{i=1}^{m}a^{6}_i &< 10 \end{align} hold simultaneously with probability at least $1 - 3n^{-2}$.
This is an intermediate step in a proof on page 38 of a paper, which is why the constants might look arbitrary or not the tightest possible. The authors say it is a consequence of Chebyshev's inequality.
Here is my attempt at proving it. If we show that $\mathbb{P}\left[X_1 > \epsilon \right] \le 1/n^2$, and similar inequalities for $X_2, X_3$, whenever $m \ge C\cdot n$, then a union bound gives the result. However, I am not able to prove the inequality even for $X_1$.
Clearly $\mathbb{E}\left[X_1\right] = 0$, since $\mathbb{E}\left[a_i^2\right] = 1$. The variance of $X_1$ is \begin{align} \operatorname{Var}\left[X_1\right] &= \frac{1}{m^2} \mathbb{E} \left[ \left(\sum_{i=1}^{m}\left(a^{2}_i - 1\right) \right)^2 \right] \\ &= \frac{1}{m^2} \sum_{i=1}^{m}\mathbb{E} \left[\left(a^{2}_i - 1\right)^2 \right] \quad \text{(cross terms vanish by independence)} \\ & = \frac{1}{m^2} \sum_{i=1}^{m}\mathbb{E} \left[a^{4}_i + 1 - 2a_i^2 \right] \\ &= \frac{1}{m^2} \sum_{i=1}^{m} (3 + 1 - 2) = \frac{2}{m}, \end{align} using $\mathbb{E}\left[a_i^4\right] = 3$ for a standard Gaussian.
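As a quick sanity check on the computation above (not part of the proof), here is a Monte Carlo estimate of $\mathbb{E}[X_1]$ and $\operatorname{Var}[X_1]$; the values of `m` and the number of trials are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
m, trials = 1_000, 20_000

# Each row is one sample of (a_1, ..., a_m) with a_i ~ N(0,1) iid,
# so each row gives one realization of X_1 = (1/m) * sum(a_i^2 - 1).
a = rng.standard_normal((trials, m))
X1 = (a**2 - 1).mean(axis=1)

print(X1.mean())         # should be close to 0, matching E[X_1] = 0
print(X1.var(), 2 / m)   # empirical variance vs. the derived 2/m
```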
By Chebyshev's inequality, \begin{align} \mathbb{P}\left[X_1 > \epsilon \right] &\le \mathbb{P}\left[\left\lvert X_1\right\rvert > \epsilon \right] \\ &\le \frac{\operatorname{Var}\left[X_1\right]}{\epsilon^2} \\ &= \frac{2}{m\epsilon^2} \\ &\le \frac{2}{Cn\epsilon^2} \quad \text{using $m \ge Cn$} \\ &\le \frac{1}{n} \quad \text{for $C \ge 2\epsilon^{-2}$} \end{align} But this only gives a failure probability of $1/n$, not the $1/n^2$ needed for the union bound: forcing $2/(m\epsilon^2) \le 1/n^2$ would require $m \ge 2n^2\epsilon^{-2}$, which grows faster than $C \cdot n$. This is where I am stuck.
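For what it's worth, the Chebyshev bound $2/(m\epsilon^2)$ can be compared numerically against the empirical tail probability (again just a sanity check; the choices of `m`, `eps`, and the trial count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
m, trials, eps = 200, 50_000, 0.2

# Realizations of X_1 for a_i ~ N(0,1) iid.
a = rng.standard_normal((trials, m))
X1 = (a**2 - 1).mean(axis=1)

empirical = (np.abs(X1) > eps).mean()   # estimate of P[|X_1| > eps]
chebyshev = 2 / (m * eps**2)            # the bound Var[X_1] / eps^2
print(empirical, chebyshev)
```

The empirical tail sits well below the Chebyshev bound, which is consistent with Chebyshev being loose for this sum.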