I have seen two versions of the Central Limit Theorem (CLT) with Lindeberg's condition as a requirement. One is for triangular arrays of the form $\{X_{n,k} \mid 1 \leq k \leq n,\ n\in \mathbb{N} \}$
(1)$$\forall \epsilon>0 : \underset{n \rightarrow \infty}{\lim}\sum_{k=1}^{n} E[ X_{n,k}^2 \mathbb{1}_{\{|X_{n,k}| < \epsilon\}}] = 0$$
and another one for a 'simple' sequence of random variables (RVs)
(2)$$\forall \epsilon>0 :\underset{n \rightarrow \infty}{\lim}\frac{1}{s_n^2}\sum_{k=1}^{n} E[ X_{k}^2 \mathbb{1}_{\{|X_k| < \epsilon s_n\}}] = 0$$
where $s_n^2 := \sum_{k=1}^{n} \sigma_{k}^2$. In both cases each RV has expectation $0$, and the variances $\sigma_{n,k}^2$ and $\sigma_{k}^2$, respectively, are finite (in the triangular-array case one also demands that $\lim_{n \rightarrow \infty} \sum_{k=1}^{n} \sigma_{n,k}^2 = \sigma^2$ for some $\sigma \in \mathbb{R}_{+}$). You can see a more general form of the condition and a corresponding CLT here:
https://en.wikipedia.org/wiki/Lindeberg%27s_condition
My understanding so far is that Lindeberg's condition guarantees that the individual variances are 'sufficiently small', so that no single term dominates the sum; feel free to correct me on that. What I don't understand at all is why the indicator function $\mathbb{1}_{\{|X_k| < \epsilon\}}$ appears in that sum. Couldn't I instead demand
$$\underset{n \rightarrow \infty}{\lim}\sum_{k=1}^{n} E[ X_{n,k}^2] = 0$$
in the case of (1) and
$$\underset{n \rightarrow \infty}{\lim}\sum_{k=1}^{n} E[ X_{k}^2 ] = \sigma^2 < \infty$$
in the case of (2) instead? I realize the question on the intuition of Lindeberg's condition (Interpreting the Lindeberg's condition) still does not have an answer, so I don't expect much. But any light you could shed on this particular issue would be greatly appreciated.
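For what it's worth, here is a quick numeric look at the two quantities in (2) for the simplest case I could try (an assumed iid example: $X_k = \xi_k$ standard normal, so $s_n^2 = n$), using the large-value truncation $|X_k| > \epsilon s_n$ as in the linked Wikipedia statement:

```python
# Sketch with an assumed iid example (xi_k standard normal, X_k = xi_k,
# s_n^2 = n): the Lindeberg sum in (2) tends to 0 as n grows, while the
# plain sum of second moments equals n and diverges.
import numpy as np

rng = np.random.default_rng(1)
eps = 0.5
xi = rng.standard_normal(200_000)  # Monte Carlo samples of one X_k

for n in [1, 10, 100, 1000]:
    s_n = np.sqrt(n)  # s_n^2 = sum of n unit variances
    # All n terms share the same expectation, so multiply one estimate by n.
    lindeberg = n * np.mean(xi**2 * (np.abs(xi) > eps * s_n)) / s_n**2
    plain_sum = n * np.mean(xi**2)  # = sum_{k=1}^n E[X_k^2], about n
    print(n, round(lindeberg, 4), round(plain_sum, 1))
```

So at least in this toy case the truncated, normalized sum shrinks toward $0$, while $\sum_{k=1}^{n} E[X_k^2] = n$ grows without bound.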
I believe your condition $$\underset{n \rightarrow \infty}{\lim}\sum_{k=1}^{n} E[ X_{n,k}^2] = 0$$ is too restrictive, since it rules out, for example, the case where the $X_{n,k}$ are the normalized terms of a linear combination of iid random variables.
Take a sequence of iid random variables $\xi_i$ with mean $0$ and variance $1$, and a (non-random) sequence $a_i \in \mathbb{R}$ satisfying
$$\max_{1 \leq i \leq n} \frac{|a_i|}{\|a\|_2} \rightarrow 0 \quad \text{as } n \rightarrow \infty,$$
where $\|a\|_2^2 := \sum_{i=1}^{n} a_i^2$.
Now, define the normalized elements of the linear combination:
$$X_{n,i} = \frac{a_i \xi_i}{\|a\|_2}$$
These satisfy the Lindeberg condition, but not your more restrictive condition:
$$\underset{n \rightarrow \infty}{\lim}\sum_{k=1}^{n} E[ X_{n,k}^2] = \underset{n \rightarrow \infty}{\lim}\sum_{k=1}^{n} \frac{a_k^2}{\|a\|_2^2}\, E[\xi_k^2] = 1 \neq 0.$$
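As a quick numeric sanity check of this counterexample (a sketch, not a proof, taking the assumed special case $a_i = 1$, so $X_{n,i} = \xi_i/\sqrt{n}$ with $\xi_i$ standard normal, and using the large-value truncation $|X_{n,i}| > \epsilon$):

```python
# Sketch: with a_i = 1 we have X_{n,i} = xi_i / sqrt(n), xi_i iid standard
# normal.  The Lindeberg sum with truncation |X_{n,i}| > eps tends to 0 as
# n grows, while the plain sum of second moments stays at 1.
import numpy as np

rng = np.random.default_rng(0)
eps = 0.5

def sums(n, samples=200_000):
    # By symmetry every one of the n terms has the same expectation,
    # so estimate a single term by Monte Carlo and multiply by n.
    x = rng.standard_normal(samples) / np.sqrt(n)
    truncated = n * np.mean(x**2 * (np.abs(x) > eps))  # Lindeberg sum
    plain = n * np.mean(x**2)                          # sum of E[X_{n,k}^2]
    return truncated, plain

for n in [1, 10, 100, 1000]:
    truncated, plain = sums(n)
    print(n, round(truncated, 4), round(plain, 4))
```

The truncated sum shrinks toward $0$ while the plain sum stays near $1$, which is exactly the gap between the Lindeberg condition and your proposed replacement.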
As a final note, I believe your statement of the Lindeberg condition is incorrect: the thresholding should involve $\{|X_i| > \epsilon\}$, not $\{|X_i| < \epsilon\}$.
See: https://people.stat.sc.edu/gregorkb/STAT_824_sp_2021/STAT_824_Lec_11_supplement.pdf