Normal Approximation to Binomial Distribution using Moment Generating Functions


We're asked to show, without using the central limit theorem, that the normal approximation to the binomial distribution is suitable as $n$ tends to infinity.

I've managed to show the answer, but it involves some messy infinite series.

E.g.: $$M_Z(t) = M_{\frac{X-\mu}{\sigma}}(t) = e^{-\frac{t\mu}{\sigma}}\left[1 + \theta\left(e^{\frac{t}{\sigma}} - 1\right)\right]^n$$ $$\mu = n\theta \quad \text{and} \quad \sigma = \sqrt{n\theta(1-\theta)}$$

Then, without reproducing it all here: I take logarithms, use the infinite series for $e^{\frac{t}{\sigma}}$ and then for $\ln(1+x)$, collect terms, and show that most of them tend to zero as $n$ tends to infinity, leaving the normal MGF. It's quite ugly, and I'm wondering if there is a cleaner method (but still using MGFs)?
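For reference, the argument described above condenses to a few lines once both series are truncated at second order (a sketch, for fixed $t$, using $\mu = n\theta$ and $\sigma^2 = n\theta(1-\theta)$):

\begin{align*}
\ln M_Z(t) &= -\frac{t\mu}{\sigma} + n\ln\!\left[1 + \theta\left(e^{t/\sigma} - 1\right)\right] \\
&= -\frac{t\mu}{\sigma} + n\left[\theta\left(\frac{t}{\sigma} + \frac{t^2}{2\sigma^2}\right) - \frac{\theta^2 t^2}{2\sigma^2} + O\!\left(\sigma^{-3}\right)\right] \\
&= -\frac{t\mu}{\sigma} + \frac{n\theta t}{\sigma} + \frac{n\theta(1-\theta)\,t^2}{2\sigma^2} + O\!\left(n\sigma^{-3}\right) \\
&= \frac{t^2}{2} + O\!\left(n^{-1/2}\right) \longrightarrow \frac{t^2}{2},
\end{align*}

where the first two terms cancel because $\mu = n\theta$, the quadratic term equals $t^2/2$ because $\sigma^2 = n\theta(1-\theta)$, and $n\sigma^{-3} = O(n^{-1/2})$. The limit is the log-MGF of the standard normal.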
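As a quick numerical sanity check of the limit (not a proof), one can evaluate the standardized binomial MGF directly and watch it approach $e^{t^2/2}$; the function name below is my own, introduced for illustration:

```python
import math

def standardized_binomial_mgf(t, n, theta):
    """MGF of Z = (X - n*theta) / sqrt(n*theta*(1-theta)) for X ~ Bin(n, theta)."""
    mu = n * theta
    sigma = math.sqrt(n * theta * (1 - theta))
    # M_Z(t) = exp(-t*mu/sigma) * [1 + theta*(exp(t/sigma) - 1)]^n
    return math.exp(-t * mu / sigma) * (1 + theta * (math.exp(t / sigma) - 1)) ** n

# The standard normal MGF at t = 1 is exp(1/2) ~ 1.6487; the binomial
# values should close in on it as n grows.
for n in (10, 1_000, 1_000_000):
    print(n, standardized_binomial_mgf(1.0, n, 0.3))
print("target:", math.exp(0.5))
```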