Let $X_i^{(n)} \sim \operatorname{Ber}(p_{i,n})$, for $n\in \mathbb N$ and $i\in \{1,\dots, n\}$, be Bernoulli random variables on a probability space $(\Omega, \mathcal F, \mathbb P)$ such that $X_1^{(n)},\dots, X_n^{(n)}$ are independent for every $n$, and suppose there exists $\alpha >0$ such that $\lim_{n\to\infty}\max_{i\in \{1,\dots,n\}} |np_{i,n}-\alpha| =0$. Calculate the characteristic function of $S_n:= \sum_{i=1}^n X_i^{(n)}$ for all $n$. Then show that the distribution of $S_n$ converges weakly as $n\to\infty$ and identify its limit.
My approach: Since the $X_i^{(n)}$ are independent, their characteristic functions satisfy $\phi_{X_1^{(n)}+\dots +X_n^{(n)}}=\phi_{X_1^{(n)}}\cdots\phi_{X_n^{(n)}}$.
We have $$\phi_{X_j^{(n)}}(t)=1-p_{j,n}+p_{j,n}e^{it}$$ and therefore $$\phi_{S_n}(t)=\prod_{j=1}^n(1-p_{j,n}+p_{j,n}e^{it}).$$
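As a quick numerical sanity check (not part of the proof), the product formula can be compared against the characteristic function computed directly from the exact pmf of $S_n$; the probabilities below are hypothetical example values, chosen arbitrarily in $(0,1)$:

```python
import cmath

# hypothetical example success probabilities for n = 4
ps = [0.1, 0.2, 0.3, 0.4]
t = 0.7

# characteristic function via the product formula
phi_product = 1
for p in ps:
    phi_product *= 1 - p + p * cmath.exp(1j * t)

# characteristic function via the exact pmf of S_n,
# built by convolving the Bernoulli pmfs one at a time
pmf = [1.0]  # pmf[k] = P(S = k), starting from the empty sum
for p in ps:
    new = [0.0] * (len(pmf) + 1)
    for k, q in enumerate(pmf):
        new[k] += q * (1 - p)      # X_j = 0
        new[k + 1] += q * p        # X_j = 1
    pmf = new
phi_direct = sum(q * cmath.exp(1j * t * k) for k, q in enumerate(pmf))

print(abs(phi_product - phi_direct))  # essentially zero
```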
Now I want to show that this function converges pointwise as $n\to\infty$ to some function $f$ which is continuous at $0$; then by Lévy's continuity theorem the distribution of $S_n$ converges weakly to the distribution corresponding to $f$.
But here I am struggling. I tried to consider $\lim_{n\to\infty} \exp(\log \phi_{S_n}(t))= \lim_{n\to\infty}\exp\left(\sum_{j=1}^{n} \log(1-p_{j,n} + p_{j,n}e^{it})\right)$, but I am unable to continue.
I think this is somehow related to the Poisson approximation theorem.
How does one find its limit here? And why did the author write $X_i^{(n)}$ instead of simply $X_i$, i.e. what is this index good for?
As you noted, if $p_{i,n}$ only depends on $n$, then the $S_n$ are binomial, and the Poisson approximation theorem tells you the limiting distribution is Poisson with mean $\lim_{n \to \infty} n p_n$. Sangchul Lee's hint is that under the slightly more general conditions in your post, the limiting distribution is still Poisson.
The Poisson distribution with mean $\alpha$ has characteristic function $\exp(\alpha (e^{it} - 1))$. So you need to show $$\sum_{i=1}^n \log(1 - p_{i,n} + p_{i,n} e^{it}) \to \alpha (e^{it} - 1).$$ The standard route is the expansion $\log(1+z) = z + O(z^2)$ with $z = p_{i,n}(e^{it}-1)$: the linear terms sum to $(e^{it}-1)\sum_{i=1}^n p_{i,n} \to \alpha(e^{it}-1)$, while the quadratic error terms vanish because $\max_i p_{i,n} \to 0$.
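The claimed convergence can be checked numerically. Below is a minimal sketch with a hypothetical choice $p_{i,n} = \alpha/n + i/n^3$, which satisfies the hypothesis since $\max_i |np_{i,n} - \alpha| = \max_i i/n^2 \le 1/n \to 0$:

```python
import cmath

alpha = 2.0
t = 1.3

def log_phi_sum(n):
    """Sum of log(1 - p + p e^{it}) for p_{i,n} = alpha/n + i/n^3."""
    total = 0j
    for i in range(1, n + 1):
        p = alpha / n + i / n**3
        total += cmath.log(1 - p + p * cmath.exp(1j * t))
    return total

# limit predicted by the Poisson characteristic function
target = alpha * (cmath.exp(1j * t) - 1)

for n in (10, 100, 10000):
    print(n, abs(log_phi_sum(n) - target))  # error shrinks like O(1/n)
```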