In my textbook, Lyapunov's condition is shown to imply Lindeberg's condition by:
$$\sum_{k=1}^nE1_{(|X_{nk}|>c)}X_{nk}^2\le\sum_{k=1}^nE1_{(|X_{nk}|>c)}\tfrac{1}{c^\delta}|X_{nk}|^{2+\delta}\le\frac{1}{c^\delta}\sum_{k=1}^nE|X_{nk}|^{2+\delta}$$
Hence if $\sum_{k=1}^nE|X_{nk}|^{2+\delta}$ converges, so does $\sum_{k=1}^nE1_{(|X_{nk}|>c)}X_{nk}^2$.
What I don't get is why we need $\delta>0$. It seems like the above argument works just as well for $\delta=0$. Yet every source I can find requires $\delta>0$, which forces them to assume the existence of moments of order $2+\delta$.
Am I missing something totally obvious here?
Lindeberg's condition is that $1/s_n^2$ times your left-hand side (with $c = \epsilon s_n$) tends to $0$ as $n \to \infty$, for every $\epsilon > 0$. The factor $1/c^\delta = 1/(\epsilon s_n)^\delta$ with $\delta > 0$ is what makes this quantity go to $0$; with $\delta = 0$ that factor is just $1$ and contributes no decay.
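Spelled out: substituting $c = \epsilon s_n$ into your chain of inequalities and dividing by $s_n^2$ gives
$$\frac{1}{s_n^2}\sum_{k=1}^n E\big[1_{(|X_{nk}|>\epsilon s_n)}X_{nk}^2\big] \;\le\; \frac{1}{\epsilon^\delta\, s_n^{2+\delta}}\sum_{k=1}^n E|X_{nk}|^{2+\delta} \;\longrightarrow\; 0,$$
where the right-hand side tends to $0$ precisely because that is what Lyapunov's condition asserts. Note the extra power $s_n^\delta$ in the denominator: that is the decay you lose when $\delta = 0$.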
Let's look at the simple case where the $X_{nk}$ are iid with standard deviation $\sigma$, so $s_n = \sqrt{n}\,\sigma$. With $\delta = 0$ the bound above becomes $\sum_{k=1}^n E[X_{nk}^2] = n\sigma^2$, and after dividing by $s_n^2 = n\sigma^2$ all you get is the constant bound $1$, i.e. boundedness rather than convergence to $0$.
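To see this concretely, here is a minimal numerical sketch for iid standard normal summands ($\sigma = 1$, $s_n = \sqrt{n}$). The function names `lindeberg_term` and `delta_zero_bound`, and the choice $\epsilon = 0.5$, are just illustrative; the closed form $E[Z^2\,1_{(|Z|>a)}] = 2\big(a\varphi(a) + 1 - \Phi(a)\big)$ for a standard normal $Z$ follows from integration by parts.

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal: pdf = phi, cdf = Phi

def lindeberg_term(n, eps=0.5):
    """Lindeberg quantity (1/s_n^2) * sum_k E[1_{|X_nk| > eps*s_n} X_nk^2]
    for n iid standard normals, where s_n = sqrt(n).
    Uses E[Z^2 1_{|Z|>a}] = 2*(a*phi(a) + 1 - Phi(a)) with a = eps*sqrt(n)."""
    a = eps * n ** 0.5
    return 2 * (a * nd.pdf(a) + 1 - nd.cdf(a))

def delta_zero_bound(n):
    """The delta = 0 bound: (1/s_n^2) * sum_k E[X_nk^2] = n/n = 1 for every n."""
    return 1.0

for n in (10, 100, 1000):
    print(n, lindeberg_term(n), delta_zero_bound(n))
```

Running this shows the Lindeberg quantity itself shrinking rapidly to $0$ as $n$ grows, while the $\delta = 0$ bound sits at $1$ for every $n$: the bound is simply too weak to witness the convergence, which is why the sources insist on $\delta > 0$.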