Let $X_n$ be a sequence of $\mathbb N_0$-valued random variables and denote by $g_{X_n}$ their generating functions, i.e. $g_{X_n}(s) = \mathbb E[s^{X_n}] = \sum_{k=0}^{\infty} s^k \mathbb P(X_n=k)$.
I want to show that
$$X_n \rightarrow X \text{ weakly} \Leftrightarrow g_{X_n} \rightarrow g_{X} \text{ pointwise on } [0,1]$$
It boils down to exchanging the limit in $n$ with the expectation, but neither the monotone nor the dominated convergence theorem is applicable.
How can one prove the above statement?
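As a quick numerical sanity check of the claimed equivalence (not a proof), one can look at the classical example $\mathrm{Bin}(n,\lambda/n)\to\mathrm{Poisson}(\lambda)$ weakly and watch the generating functions converge pointwise on $[0,1]$; the choice $\lambda=2$ and the grid below are arbitrary.

```python
import math

# Binomial(n, lam/n) -> Poisson(lam) weakly, so the pgfs
#   g_n(s) = (1 + lam*(s-1)/n)^n
# should converge pointwise on [0,1] to g(s) = exp(lam*(s-1)).
# (lam and the evaluation grid are arbitrary choices.)

lam = 2.0

def pgf_binomial(s, n):
    # pgf of Binomial(n, p) with p = lam/n:  (1 - p + p*s)^n
    return (1.0 + lam * (s - 1.0) / n) ** n

def pgf_poisson(s):
    return math.exp(lam * (s - 1.0))

grid = [i / 100 for i in range(101)]  # s in [0, 1]

def max_gap(n):
    # sup over the grid of |g_{X_n}(s) - g_X(s)|
    return max(abs(pgf_binomial(s, n) - pgf_poisson(s)) for s in grid)

print(max_gap(10), max_gap(1000))  # the gap shrinks as n grows
```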
First, note that for random variables with values in $\Bbb N_0$, $X_n\to X$ weakly is equivalent to $$\tag{C}\forall k\in\Bbb N_0,\quad \lim_{n\to +\infty} P(X_n=k)=P(X=k).$$ We thus have to show that this condition is equivalent to $$\tag{C'}\forall s\in [0,1],\quad \lim_{n\to +\infty} g_{X_n}(s)=g_X(s).$$

Assume that (C) holds. We have pointwise convergence for $s=1$ since $g_{X_n}(1)=1=g_X(1)$; for $0\leqslant s<1$, we use the fact that $\sum_k s^k$ is convergent: for every $N$, $$|g_{X_n}(s)-g_X(s)|\leqslant \sum_{k=0}^N s^k|P(X_n=k)-P(X=k)|+2\sum_{k\geqslant N+1}s^k.$$ Given $\varepsilon>0$, first choose $N$ so that the tail term is smaller than $\varepsilon$, then let $n\to +\infty$ in the finite sum.

Conversely, assume that (C') holds. Since $0\leqslant g_{X_n}\leqslant 1$ on $[0,1]$, the dominated convergence theorem lets us integrate the pointwise convergence; using $\int_0^1 s^k\,\mathrm ds=\frac 1{k+1}$ and iterating the operation $g\mapsto \left(s\mapsto \frac 1s\int_0^s g(t)\,\mathrm dt\right)$, one shows that for each integer $p\geqslant 1$, $$\lim_{n\to +\infty}\sum_{k=0}^{+\infty}P(X_n=k)\frac 1{(k+1)^p}=\sum_{k=0}^{+\infty}P(X=k)\frac 1{(k+1)^p}.$$ Since polynomials are dense in $C[0,1]$ (Stone–Weierstrass), it follows that for each $f\in C[0,1]$, $$\lim_{n\to +\infty}\sum_{k=0}^{+\infty}P(X_n=k)f\left(\frac 1{k+1}\right)=\sum_{k=0}^{+\infty}P(X=k)f\left(\frac 1{k+1}\right).$$ Choosing $f$ continuous, equal to $1$ at $\frac 1{k+1}$ and to $0$ outside a small neighbourhood of $\frac 1{k+1}$ containing no other point $\frac 1{j+1}$, we recover (C), which is convergence in law.
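The integration step in the converse can be illustrated numerically (a sketch, not part of the argument): for a geometric distribution with pgf $g(s)=\frac{1-q}{1-qs}$, integrating $g$ over $[0,1]$ gives $\sum_k P(X=k)\frac1{k+1}$, and applying $g\mapsto \frac1s\int_0^s g$ once before integrating gives $\sum_k P(X=k)\frac1{(k+1)^2}$. The distribution, $q=0.5$, and the tolerances are my choices.

```python
import math

# X ~ Geometric: P(X = k) = (1-q) q^k for k >= 0, with pgf
# g(s) = (1-q)/(1 - q s).  Integrating the pgf term by term uses
# Integral_0^1 s^k ds = 1/(k+1).

q = 0.5
pmf = lambda k: (1 - q) * q ** k
g = lambda s: (1 - q) / (1 - q * s)

# G(s) = Integral_0^s g(t) dt, in closed form for this particular g.
G = lambda s: -(1 - q) / q * math.log(1 - q * s)

def midpoint(f, n=100_000):
    # midpoint rule on [0, 1]; avoids evaluating f at the endpoint s = 0
    h = 1.0 / n
    return h * sum(f((i + 0.5) * h) for i in range(n))

# truncated series sum_k P(X = k) / (k+1)^p  (tail is ~ q^200, negligible)
series = lambda p: sum(pmf(k) / (k + 1) ** p for k in range(200))

moment1 = midpoint(g)                   # Integral_0^1 g(s) ds
moment2 = midpoint(lambda s: G(s) / s)  # Integral_0^1 (1/s) Integral_0^s g

print(moment1, series(1))  # both equal E[1/(X+1)]
print(moment2, series(2))  # both equal E[1/(X+1)^2]
```

For $q=\tfrac12$ the first pair equals $\ln 2$ in closed form, which makes the agreement easy to verify by hand.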