I am interested in the following statement:
The standardized row sums of a triangular array of uniformly bounded, non-constant, row-wise independent real random variables converge in distribution to the standard Gaussian distribution if and only if the variance of the row sums tends to infinity.
In technical terms, this is:
Fix $C>0$. For each $n \in \mathbb{N}_+$ let $X_{1,n}, \dots, X_{n,n}$ be independent random variables, each with support contained in $[-C,C]\subset \mathbb{R}$, zero mean and nonzero variance $s_{i,n}^{2} > 0$, and set $X_{n} = \sum_{i=1}^{n} X_{i,n}$ and $s_n^2 = \mathbb{V}(X_n) = \sum_{i=1}^{n} s_{i,n}^2$.
Then $$X_{n} / s_n \rightarrow N(0,1) \Longleftrightarrow s_n \rightarrow \infty.$$
The implication from right to left follows from the observation that the Lindeberg condition is satisfied: since each $|X_{i,n}|$ is bounded by $C$, the truncated terms in the Lindeberg sum vanish as soon as $\varepsilon s_n > C$.
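For completeness, here is how that Lindeberg verification can be written out (a sketch, using the notation above):

```latex
% Lindeberg condition for the triangular array (X_{i,n}):
% for every eps > 0,
%   L_n(eps) = (1/s_n^2) * sum_i E[ X_{i,n}^2 * 1{|X_{i,n}| > eps s_n} ]  ->  0.
% Since |X_{i,n}| <= C almost surely and s_n -> infinity, once
% s_n > C/eps every indicator is identically zero, hence L_n(eps) = 0
% for all sufficiently large n.
\[
  L_n(\varepsilon)
  = \frac{1}{s_n^2} \sum_{i=1}^{n}
    \mathbb{E}\bigl[ X_{i,n}^2 \,\mathbf{1}_{\{|X_{i,n}| > \varepsilon s_n\}} \bigr]
  = 0
  \qquad \text{whenever } \varepsilon s_n > C .
\]
```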
But what about the forward implication?
I would like to argue as follows:
If $s_n \rightarrow S < \infty$, then the limit distribution $\lim_{n \rightarrow \infty} X_n/s_n$ exists and is not equal to $N(0,1)$, although I do not know how to show this; perhaps by an explicit computation, or by Cramér's decomposition theorem?
Now if $0 < s_n \not\rightarrow \infty$, then $(s_n)$ has a bounded, hence convergent, subsequence $s_{m_n} \rightarrow S < \infty$. By the first step, the corresponding subsequence $X_{m_n}/s_{m_n}$ does not converge towards $N(0,1)$, so $X_n/s_n$ cannot converge towards $N(0,1)$ either.
The first part is unclear to me, and in the second part I used that, if $X_n$ converges in distribution to $X$, then every subsequence of $X_n$ also converges in distribution to $X$. This seems to be a direct consequence of the definition of convergence in distribution, but I have not found this statement written down in any textbook.
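Regarding that subsequence fact, one way to record the one-line argument (via the characterization with bounded continuous test functions) is:

```latex
% X_n -> X in distribution means E[f(X_n)] -> E[f(X)] for every bounded
% continuous f. A convergent sequence of real numbers converges along
% every subsequence, so E[f(X_{m_n})] -> E[f(X)] as well, i.e.
% X_{m_n} -> X in distribution.
\[
  \mathbb{E}\,f(X_n) \to \mathbb{E}\,f(X)
  \;\Longrightarrow\;
  \mathbb{E}\,f(X_{m_n}) \to \mathbb{E}\,f(X)
  \qquad \text{for every bounded continuous } f .
\]
```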
I would very much appreciate pointers to references where these situations are treated and also some words on the correctness of my arguments.
Also, in the situation I actually need, the above holds with the additional property that all the variables are Boolean (i.e., $\{0,1\}$-valued Bernoulli variables, centred so that the zero-mean assumption is met).
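For the Bernoulli case, here is a quick numerical sanity check (a sketch only; the choices $n=2000$, $p=0.3$ and the sample count are arbitrary): with i.i.d. centred Bernoulli($p$) entries one has $s_n^2 = np(1-p) \to \infty$, and the standardized row sums should look approximately standard normal.

```python
import numpy as np

rng = np.random.default_rng(0)

def standardized_row_sum(n, p=0.3, samples=20000):
    """Sample X_n = sum of n i.i.d. centred Bernoulli(p) variables,
    standardized by s_n = sqrt(n * p * (1 - p))."""
    # Each row is one realization of the n-th row of the triangular array.
    x = rng.binomial(1, p, size=(samples, n)) - p  # centred Bernoulli entries
    s_n = np.sqrt(n * p * (1 - p))                 # s_n^2 = n p (1-p) -> infinity
    return x.sum(axis=1) / s_n

z = standardized_row_sum(2000)
# Empirical mean and variance of the standardized sums should be close
# to 0 and 1 respectively if the normal approximation is kicking in.
print(z.mean(), z.var())
```

This only illustrates the right-to-left implication numerically, of course; it says nothing about the converse direction asked about above.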