Prove the central limit theorem for a sequence of i.i.d. Bernoulli($p$) random variables, where $p\in(0,1)$.
I am trying to do this by computing the moment generating function of the standardized sum $S_n^* = \frac{S_n - np}{\sqrt{np(1-p)}}$, where $S_n = X_1 + \dots + X_n$, and using a Taylor expansion to show that it converges to the moment generating function of a standard normal.
Attempt: For a random variable $X$, the moment generating function is $$M_{X}(t)=\mathbb{E}[e^{tX}],$$ and expanding $e^{tX}$ in its Taylor series gives $$M_X(t)=\sum_{n=0}^\infty \frac{\mathbb{E}[X^n]}{n!}t^n.$$ In particular, $$M_X^{(n)}(0)=\mathbb{E}[X^n].$$
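As a sanity check on the identity $M_X^{(n)}(0)=\mathbb{E}[X^n]$, one can differentiate the Bernoulli MGF symbolically (a small sketch using `sympy`; the variable names are my own):

```python
import sympy as sp

t, p = sp.symbols('t p')
# MGF of a Bernoulli(p) variable: M(t) = (1 - p) + p*e^t
M = (1 - p) + p * sp.exp(t)

# E[X^k] = M^(k)(0); since X takes only the values 0 and 1,
# X^k = X for every k >= 1, so every moment should equal p.
for k in range(1, 5):
    moment = sp.diff(M, t, k).subs(t, 0)
    assert sp.simplify(moment - p) == 0
```

Every derivative of $M$ at $0$ simplifies to $p$, as expected for a $\{0,1\}$-valued variable.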
Now for the proof: a Bernoulli random variable equals $1$ with probability $p$ and $0$ with probability $1-p$, so its moment generating function is $$M_{\mathrm{Bernoulli}(p)}(t)=(1-p)+pe^t=1+(e^t-1)p.$$ A binomial random variable is the sum of $n$ independent Bernoulli($p$) variables, so $$M_{\mathrm{Binomial}(n,p)}(t)=\big((1-p)+pe^t\big)^n=\big(1+(e^t-1)p\big)^n.$$ Suppose that $p=\lambda/n$ and notice that $$M_{\mathrm{Binomial}(n,\lambda/n)}(t)=\bigg(1+\frac{(e^t-1)\lambda}{n}\bigg)^n \rightarrow e^{\lambda(e^t-1)}.$$
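The limit in the last display can be checked numerically (a quick sketch; the values $\lambda=2$ and $t=0.5$ are arbitrary choices of mine):

```python
import math

# Check that (1 + (e^t - 1)*lam/n)^n approaches exp(lam*(e^t - 1)),
# i.e. the MGF of a Poisson(lam) random variable, as n grows.
lam, t = 2.0, 0.5          # arbitrary test values
target = math.exp(lam * (math.exp(t) - 1))
for n in (10, 100, 10_000):
    approx = (1 + (math.exp(t) - 1) * lam / n) ** n
    print(n, approx, target)
```

As $n$ increases, `approx` gets closer to `target`.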
I am not sure how to finish the proof though or if I am even on the right track...