I really don't want to prove this using induction. Is there a clever way I can use the m.g.f. definition to prove it?
i.e. If $M(t)=E(e^{tX})$, and if $M^{(m)}(t)$ represents the $m$th derivative of $M(t)$ for $m=1,2,3,\ldots$, then $M^{(m)}(0)=E(X^{m})$.
If there is, I'm not sure exactly how to start. I mean $e^{tX}$ is already some function of $X$, call it $u(X)$. Then $M(t)=E[u(X)]$. But $e^{t(X-b)}$ is a different function of $X$, call it $w(X)$, but that doesn't really get me anywhere.
Although, can't I just let $Y=X-b$, so that $M_{Y}(t)=E(e^{tY})=E(e^{t(X-b)})=R(t)$, and then conclude $$M_{Y}^{(m)}(0)=R^{(m)}(0)=E(Y^{m})=E\big[(X-b)^{m}\big]?$$
Here is a rigorous proof. Note that $e^{a}+e^{-a} \geq e^{|a|}$ for any real number $a$. It is implicitly assumed that $R(t) <\infty$ for $-h <t<h$, so taking $a=t(X-b)$ gives $e^{|t||X-b|} \leq e^{t(X-b)}+e^{-t(X-b)}$, and since both $R(t)$ and $R(-t)$ are finite we conclude that $Ee^{|t||X-b|} <\infty$ for $|t|<h$. An application of the Fubini/Tonelli theorem now shows that $$R(t)=E \sum\limits_{n=0}^{\infty} \frac {t^{n}(X-b)^{n}} {n!}= \sum\limits_{n=0}^{\infty} \frac {t^{n}E(X-b)^{n}} {n!}.$$ By properties of power series this implies that $R^{(m)} (0)=E(X-b)^{m}$.
[If $R(t)=\sum a_n t^{n}$ for $|t| <h$, then $R^{(n)} (0)=a_n n!$.]
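As a sanity check (not part of the proof), the identity $R^{(m)}(0)=E(X-b)^{m}$ can be verified symbolically for a concrete choice of distribution. The sketch below, assuming SymPy is available, takes $X$ exponential with rate $1$ and $b=1$, computes $R(t)=E\,e^{t(X-b)}=e^{-t}/(1-t)$ by direct integration (valid for $t<1$), and compares derivatives at $0$ against the central moments:

```python
import sympy as sp

t, x = sp.symbols('t x', real=True)
b = 1  # hypothetical choice; here b = E[X] for Exponential(1)
pdf = sp.exp(-x)  # Exponential(1) density on [0, oo)

# R(t) = E[e^{t(X-b)}], computed by direct integration (finite for t < 1)
R = sp.simplify(
    sp.integrate(sp.exp(t * (x - b)) * pdf, (x, 0, sp.oo), conds='none')
)  # simplifies to exp(-t)/(1 - t)

for m in range(1, 5):
    lhs = sp.diff(R, t, m).subs(t, 0)          # R^{(m)}(0)
    rhs = sp.integrate((x - b)**m * pdf, (x, 0, sp.oo))  # E(X-b)^m
    assert sp.simplify(lhs - rhs) == 0
```

For this distribution the check confirms, e.g., $R''(0)=\operatorname{Var}(X)=1$ and $R^{(3)}(0)=E(X-1)^{3}=2$.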