Let $(X_n)$ be a sequence of independent random variables, $X_k \sim \text{Ber}(p_k) \ \forall k \ge 1$. Set $S_n = \sum_{k = 1}^n X_k, m_n = \sum_{k = 1}^n p_k, s_n^2 = \sum_{k = 1}^n p_k(1-p_k)$. Show that $$ \dfrac{S_n -m_n}{s_n} \xrightarrow{d} \mathcal{N}(0, 1) \Longleftrightarrow \sum_{k = 1}^{\infty} p_k(1-p_k) = +\infty $$
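As a sanity check on the statement (not part of a proof), here is a quick Monte Carlo sketch of the divergent case, assuming NumPy is available; the choice $p_k \equiv 0.3$ is just an illustrative example with $\sum_k p_k(1-p_k) = +\infty$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Divergent case: p_k = 0.3 for every k, so sum p_k(1 - p_k) = +infinity.
n, reps = 1000, 20000
p = np.full(n, 0.3)
X = rng.random((reps, n)) < p          # reps independent copies of (X_1, ..., X_n)
S = X.sum(axis=1)
m = p.sum()                            # m_n
s = np.sqrt((p * (1 - p)).sum())       # s_n
Z = (S - m) / s                        # standardized sums (S_n - m_n) / s_n

print(Z.mean(), Z.std())               # both should be close to 0 and 1
```

The empirical mean and standard deviation of the standardized sums land near $0$ and $1$, consistent with the claimed $\mathcal N(0,1)$ limit.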
I'm able to show the $\Leftarrow$ direction by checking the Lindeberg condition. However, I'm stuck on the $\Rightarrow$ direction and don't see a way to approach it, so any hints are appreciated. Thank you.
Update: the Lindeberg condition I'm using is from Allan Gut's book "Probability: A Graduate Course", where the author states the following Lindeberg–Lévy–Feller theorem:
Theorem. Let $X_1, X_2, \ldots$ be independent random variables with finite variances, and set, for $k \ge 1$, $\mathbb{E}(X_k) = \mu_k$, $\text{Var}(X_k) = \sigma_k^2$ and, for $n \ge 1$, $S_n = \sum_{k = 1}^n X_k$, $s_n^2 = \sum_{k = 1}^n \sigma_k^2$. Consider the conditions $$ L_1(n) = \max_{1 \le k \le n} \dfrac{\sigma_k^2}{s_n^2} \rightarrow 0 \text{ as } n \rightarrow \infty \tag{1} $$ and $$ L_2(n) = \dfrac{1}{s_n^2}\sum_{k = 1}^n \mathbb{E}[\vert X_k - \mu_k \vert^2 1\{\vert X_k - \mu_k \vert > \epsilon s_n\}] \rightarrow 0 \text{ as } n \rightarrow \infty\tag2 $$ Then: (i) if $(2)$ is satisfied, then so is $(1)$, and $$ \dfrac{1}{s_n}\sum_{k = 1}^n (X_k - \mu_k) \xrightarrow{d} \mathcal{N}(0, 1) \tag3 $$ (ii) if $(1)$ and $(3)$ are satisfied, so is $(2)$.
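For the record, the $\Leftarrow$ check in this Bernoulli setting is short. Since $\vert X_k - p_k \vert \le 1$ almost surely, and $\sum_{k=1}^\infty p_k(1-p_k) = +\infty$ is equivalent to $s_n \to \infty$, we get, for every fixed $\epsilon > 0$,

```latex
L_2(n) = \frac{1}{s_n^2}\sum_{k=1}^n
  \mathbb{E}\bigl[\vert X_k - p_k \vert^2\, 1\{\vert X_k - p_k \vert > \epsilon s_n\}\bigr]
= 0 \quad \text{as soon as } \epsilon s_n > 1,
```

so condition $(2)$ holds trivially for large $n$, and part (i) of the theorem gives $(3)$.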
Suppose not, i.e. $\sum_{k=1}^\infty p_k(1-p_k) < \infty$. Then $s_n \to s$ for some finite $s$; note $s > 0$ and some $p_k \in (0,1)$, since otherwise $s_n \equiv 0$ and the ratio $(S_n - m_n)/s_n$ is not even defined (relabel so that $p_1 \in (0,1)$). If $(S_n - m_n)/s_n \leadsto N(0,1)$, then by Slutsky $Z_n := \sum_{k=1}^n (X_k - p_k) \leadsto N(0, s^2)$. Since the $X_k - p_k$ are independent, centered, and $\sum_k \text{Var}(X_k - p_k) < \infty$, Kolmogorov's two-series theorem shows that $Z_{-1,n} := \sum_{k=2}^n (X_k - p_k)$ converges a.s. (hence in distribution) to some $Z_{-1}$, which is independent of $X_1 - p_1$. Passing to the limit, $(X_1 - p_1) + Z_{-1} \sim N(0, s^2)$. By Cramér's decomposition theorem, each of the two independent summands must itself be normal. This contradicts the fact that $X_1 - p_1$ is a nondegenerate shifted Bernoulli variable, which is not normal.
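To see the failure mode concretely, here is a simulation sketch of a convergent case (assuming NumPy; $p_k = 2^{-k}$ is an illustrative choice with $\sum_k p_k(1-p_k) < \infty$). The a.s. limit of $S_n$ is an integer-valued variable, so its law keeps atoms, which a normal limit cannot have:

```python
import numpy as np

rng = np.random.default_rng(0)

# Convergent case: p_k = 2^{-k}, so sum p_k(1 - p_k) < infinity.
n, reps = 50, 20000
p = 0.5 ** np.arange(1, n + 1)
X = rng.random((reps, n)) < p          # reps independent copies of (X_1, ..., X_n)
S = X.sum(axis=1)
Z = (S - p.sum()) / np.sqrt((p * (1 - p)).sum())

# S_n converges a.s. to an integer-valued limit, so its law has atoms;
# the "standardized" sums pile up on a few points instead of spreading out.
atom = np.bincount(S).max() / reps     # mass of the most likely value of S_n
print(atom)                            # well above 0, far from an atomless normal law
```

With these $p_k$ the most likely value is $S_n = 1$, with probability $\prod_j (1 - 2^{-j}) \cdot \sum_k \frac{2^{-k}}{1 - 2^{-k}} \approx 0.46$, so nearly half the mass sits on a single point.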