Let $(X_n)$ be a sequence of independent random variables with $X_1=0$ and
$P(X_n=n)=P(X_n=-n)=\frac{1}{2n \log n}, P(X_n=0)=1-\frac{1}{n \log n}$ for $n \ge 2$.
$\overline X_n:=\frac{1}{n}\sum_{i=1}^{n}X_i$
I am trying to understand:
$$P(|\overline X_n-0|>\epsilon)\le \frac{\operatorname{Var}(\sum_{i=1}^{n}X_i)}{n^2\epsilon^2}=\frac{\sum_{i=2}^{n}\frac{i}{\log i}}{n^2 \epsilon^2}\le\frac{\frac{n^2}{\log n}}{n^2 \epsilon^2}=\frac{1}{\epsilon^2 \log n}\to0$$
Is the first inequality some form of Chebyshev's inequality?
Then why does $$\frac{\operatorname{Var}(\sum_{i=1}^{n}X_i)}{n^2\epsilon^2}=\frac{\sum_{i=2}^{n}\frac{i}{\log i}}{n^2 \epsilon^2}\,?$$
And why is the numerator $\le \frac{n^2}{\log n}$?
The first inequality is indeed Chebyshev's inequality. Note that $EX_n = \frac{-n}{2n\log n} + \frac{n}{2n \log n} = 0$ for all $n \ge 2$, and $EX_1 = 0$ since $X_1 = 0$, so $E\overline{X}_n = 0$ by linearity of the expected value. The $n^2$ in the denominator comes from $\text{Var}(\overline{X}_n) = \frac{1}{n^2} \text{Var}\left(\sum_{i=1}^n X_i\right)$ (a general property of the variance, $\text{Var}(aY) = a^2\,\text{Var}(Y)$).
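Spelled out, the first step is just Chebyshev's inequality applied to $\overline{X}_n$ together with this scaling property:
$$P(|\overline X_n-0|>\epsilon)=P\left(|\overline X_n-E\overline X_n|>\epsilon\right)\le \frac{\operatorname{Var}(\overline X_n)}{\epsilon^2}=\frac{\operatorname{Var}\left(\sum_{i=1}^{n}X_i\right)}{n^2\epsilon^2}.$$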
For the second equality, first use the independence of the $X_i$ to write $\text{Var}\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \text{Var}(X_i)$. Then $\text{Var}(X_n) = E(X_n^2)-(EX_n)^2 = E(X_n^2)$ is computed for each $n \ge 2$ (and $\text{Var}(X_1) = 0$ since $X_1 = 0$). This is straightforward: for $n \ge 2$, $X_n^2$ can only take the values $0$ and $n^2$.
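Concretely, for $i \ge 2$,
$$\operatorname{Var}(X_i)=E(X_i^2)=i^2\cdot P(X_i=i)+(-i)^2\cdot P(X_i=-i)=i^2\cdot\frac{1}{i\log i}=\frac{i}{\log i},$$
and summing over $i=2,\dots,n$ gives the numerator $\sum_{i=2}^{n}\frac{i}{\log i}$.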
With regard to your last question, note that $\frac{d}{dx} \frac{x}{\log x} = \frac{\log(x) - 1}{\log(x)^2}$, which is non-negative for $x \ge \mathrm{e}$, so $\frac{x}{\log x}$ is non-decreasing there. Since $\frac{2}{\log2} = \frac{4}{\log 4}$, we may write $\sum_{i=2}^n \frac{i}{\log i} = \frac{4}{\log 4} + \sum_{i=3}^n \frac{i}{\log i} \le n \cdot \frac{n}{\log n}$ whenever $n > 3$: there are $n-1$ summands, and by monotonicity each is at most the last one, $\frac{n}{\log n}$. A direct computation confirms the inequality for $n = 2$ and $n = 3$ as well, although that is not really needed since we only care about large $n$.
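Putting the pieces together, for $n \ge 4$,
$$\sum_{i=2}^{n}\frac{i}{\log i}=\frac{4}{\log 4}+\sum_{i=3}^{n}\frac{i}{\log i}\le (n-1)\cdot\frac{n}{\log n}\le\frac{n^2}{\log n}.$$
For the remaining cases: $\frac{2}{\log 2}\approx 2.89\le\frac{4}{\log 2}\approx 5.77$ for $n=2$, and $\frac{2}{\log 2}+\frac{3}{\log 3}\approx 5.62\le\frac{9}{\log 3}\approx 8.19$ for $n=3$.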