I provide most of the results without explicit calculations; the actual question, at the end, is fairly short.
Let $X_i$ be i.i.d. with density $f(x) = |x|^{-3}$ for $|x|>1$ and $0$ otherwise. The claim is \begin{equation} (n\log n)^{-1/2}\sum_{i}X_i \to N(0,\sigma^2) \end{equation} in distribution, where $N(0,\sigma^2)$ denotes a normal distribution whose variance $\sigma^2$ is to be determined.
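For context, a quick moment check (a sketch of why the classical $\sqrt{n}$ scaling fails here): the density is symmetric, so $E X_i = 0$, but
$$E X_i^2 = \int_{|x|>1} x^2\,|x|^{-3}\,dx = 2\int_1^\infty \frac{dx}{x} = \infty,$$
so the ordinary CLT does not apply and the heavier $\sqrt{n\log n}$ normalization is needed.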
Define a triangular array of random variables $$Y_n^k = \frac{\bar{X}_k}{\sqrt{n\log n}},$$ where $\bar{X}_k$ is the truncated version of $X_k$ at $\pm\sqrt{n\log n}$ (set to $0$ outside that range). By symmetry $E\,Y_n^k = 0$, and since $E\,\bar{X}_k^2 = 2\int_1^{\sqrt{n\log n}} x^{-1}\,dx = \log n + \log\log n$, $$E(Y_n^k)^2 = \frac{1}{n} + O\left(\frac{\log\log n}{n\log n}\right)$$ Therefore, $$\sum_k E(Y_n^k)^2 = 1 + O\left(\frac{\log\log n}{\log n}\right) \to 1$$ To verify that Lindeberg's condition applies, note that for fixed $0<\epsilon<1$ and $n$ large enough, $$E\left[(Y_n^k)^2\,\mathbf{1}_{\{|Y_n^k| > \epsilon\}}\right] = \frac{2\log(1/\epsilon)}{n\log n}$$ Hence, $$\sum_k E\left[(Y_n^k)^2\,\mathbf{1}_{\{|Y_n^k| > \epsilon\}}\right] = \frac{2\log(1/\epsilon)}{\log n} \to 0$$ The Lindeberg–Feller CLT then concludes: $$(n\log n)^{-1/2}\sum_{k} \bar{X}_k =\sum_k Y_n^k \to N(0,1)$$
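As a sanity check on the triangular-array computation, here is a small Monte Carlo sketch (assuming NumPy; the sampler uses inverse-transform sampling, valid since $P(|X| > t) = t^{-2}$ for $t \geq 1$, and the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X(size):
    # Inverse-transform sampling: P(|X| > t) = t**-2 for t >= 1,
    # so |X| = U**-0.5 with U ~ Uniform(0, 1); the sign is symmetric.
    u = rng.uniform(size=size)
    sign = rng.choice([-1.0, 1.0], size=size)
    return sign / np.sqrt(u)

n, trials = 10_000, 2_000
M = np.sqrt(n * np.log(n))  # truncation level sqrt(n log n)

T = np.empty(trials)
for t in range(trials):
    x = sample_X(n)
    xbar = np.where(np.abs(x) <= M, x, 0.0)  # truncated X-bar
    T[t] = xbar.sum() / M                    # sum of the Y_n^k

print(T.mean(), T.std())
```

For $n = 10^4$ the predicted second moment of each normalized sum is $(\log n + \log\log n)/\log n \approx 1.24$, so the sample standard deviation should come out near $1.11$ rather than exactly $1$, reflecting the slow $\log\log n / \log n$ convergence.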
**Question:** How does one argue that $(n\log n)^{-1/2}\sum_k X_k$ converges to the same limit?
It is true that for each fixed $k$, $\bar{X}_k \to X_k$ a.s. as $n \to \infty$, so I suspect the missing step is simple.
Let $\tilde{X}_k^n$ be the truncated version of $X_k$. For fixed $k$ and $n$, we have:
$$\mathbb{P} (X_k \neq \tilde{X}_k^n) = \mathbb{P} (|X_k| \geq \sqrt{n \ln (n)}) = 2\int_{\sqrt{n \ln (n)}}^{\infty} x^{-3}\,dx = \frac{1}{n \ln (n)},$$
so that:
$$\mathbb{P} (\exists 0 \leq k < n: \ X_k \neq \tilde{X}_k^n) \leq \sum_{k=0}^{n-1} \mathbb{P} (X_k \neq \tilde{X}_k^n) = \frac{n}{n \ln (n)} = \frac{1}{\ln (n)}.$$
Hence, $(n \ln (n))^{-1/2} \sum_{k=0}^{n-1} X_k = \sum_{k=0}^{n-1} Y_n^k$ with probability $1-O (1/\ln (n)) \to 1$, so the difference of the two sides converges to $0$ in probability. Since the right-hand side converges in distribution to a standard normal, by Slutsky's theorem so does the left-hand side.
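The union bound above can also be checked numerically (again assuming NumPy with an arbitrary seed; the exact coincidence probability is $1 - (1 - 1/(n\ln n))^n \approx 1/\ln n$):

```python
import numpy as np

rng = np.random.default_rng(1)

n, trials = 10_000, 2_000
M = np.sqrt(n * np.log(n))  # truncation level sqrt(n ln n)

differ = 0
for t in range(trials):
    u = rng.uniform(size=n)
    abs_x = 1.0 / np.sqrt(u)          # |X|, with P(|X| > s) = s**-2
    differ += bool((abs_x > M).any()) # some X_k exceeds the truncation level

# Fraction of trials where the truncated and untruncated sums differ;
# the union bound predicts at most 1/ln(n) ~ 0.109 for n = 10^4.
frac_differ = differ / trials
print(frac_differ, 1.0 / np.log(n))
```

The observed fraction should land slightly below the $1/\ln n$ bound, consistent with the Bonferroni inequality being nearly tight here because the exceedance events are rare and independent.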