I found the Lyapunov condition for applying the central limit theorem, which is useful in settings where one has to deal with non-identically distributed random variables:
Lyapunov CLT. Let $Y_1, Y_2, \dots$ be independent random variables with finite variances, let $s_n^2 = \sum_{k=1}^n \text{Var}[Y_k]$ and let $Y=\sum_{k=1}^n Y_k$. If there exists $\ell>0$ s.t. $$\lim_{n\rightarrow\infty}\left( \frac{1}{s_n^{2+\ell}}\sum_{k=1}^n \text{E}\left[ |Y_k - \text{E}[Y_k]|^{2+\ell}\right] \right) = 0,$$ then $Z=(Y - \text{E}[Y])/s_n$ converges in distribution to the standard normal distribution. (By independence, $\text{Var}[Y]=s_n^2$, so $Z$ is just the standardized sum.)
While I have no problem showing that this condition holds for certain exercises, I'm wondering what the intuition is behind this condition.
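For example, here is a quick numerical sketch of the kind of check I mean (the specific choice of independent $Y_k\sim\mathrm{Uniform}(-k,k)$ and $\ell=1$ is my own illustration, not part of the theorem): the Lyapunov ratio is small and shrinks with $n$, and the standardized sum looks standard normal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Independent but non-identically distributed: Y_k ~ Uniform(-k, k).
# Then Var[Y_k] = k^2/3 and E|Y_k|^3 = k^3/4, so with ell = 1 the
# Lyapunov ratio behaves like (sum k^3) / (sum k^2)^(3/2) ~ n^4 / n^4.5 -> 0.
n = 1000
ks = np.arange(1, n + 1)
s_n = np.sqrt(np.sum(ks**2 / 3.0))

# Lyapunov ratio with ell = 1 (third absolute central moments).
lyapunov_ratio = np.sum(ks**3 / 4.0) / s_n**3
print(lyapunov_ratio)  # ≈ 0.053 for n = 1000; tends to 0 as n grows

# Empirical check: standardized sums look approximately N(0, 1).
samples = rng.uniform(-ks, ks, size=(5000, n)).sum(axis=1) / s_n
print(samples.mean(), samples.std())  # close to 0 and 1
```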
This is linked to the Lindeberg condition: for independent centered random variables $X_k$ with $s_n^2=\sum_{k=1}^n\mathbb E[X_k^2]$, we require, for each positive $\varepsilon$, $$\frac 1{s_n^2}\sum_{k=1}^n\int_{\{|X_k|\geqslant \varepsilon s_n\}}X_k^2\,\mathrm d\mathbb P\to 0.$$ We can prove the central limit theorem by checking that $\mathbb E\left[f\left(\frac{S_n}{s_n}\right)\right]\to \mathbb E[f\left(N\right)]$, where $S_n=\sum_{k=1}^n X_k$ and $N\sim N(0,1)$, only for a class of smooth bounded functions $f$. In particular, we don't need to do the test for all continuous bounded functions. Then, with this idea in mind, we can use Taylor's formula and the Lindeberg condition to control the remainder.
Now, if we are in the more favorable case in which we have moments of order $2+\delta$ for some positive $\delta$, then Lyapunov's condition $$\frac 1{s_n^{2+\delta}}\sum_{k=1}^n\mathbb E\left[|X_k|^{2+\delta}\right]\to 0$$ implies Lindeberg's condition, and is usually easier to check.
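The implication is a one-line bound (same notation as above): on the event $\{|X_k|\geqslant \varepsilon s_n\}$ we have $|X_k|^{\delta}\geqslant (\varepsilon s_n)^{\delta}$, hence $X_k^2 \leqslant |X_k|^{2+\delta}/(\varepsilon s_n)^{\delta}$, and therefore $$\frac 1{s_n^2}\sum_{k=1}^n\int_{\{|X_k|\geqslant \varepsilon s_n\}}X_k^2\,\mathrm d\mathbb P \;\leqslant\; \frac 1{\varepsilon^{\delta}\,s_n^{2+\delta}}\sum_{k=1}^n\mathbb E\left[|X_k|^{2+\delta}\right]\to 0.$$ So for each fixed $\varepsilon$, Lyapunov's condition dominates the Lindeberg sum, at the price of assuming the higher moment exists.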