Let $f:\mathbb{R} \rightarrow \mathbb{R}$ be a smooth function. Let $R$ be the radius of convergence of the Taylor series of $f$ centered at $a$. For each $n \in \mathbb{N}$, let $M_n = \sup\{f^{(n)}(t) : t \in (a-R, a+R)\}$ and assume $\displaystyle\lim_{n\to \infty}\frac{M_n}{n!}R^n < \infty$.
Show that $x \in (a-R, a+R) \implies \displaystyle\lim_{n\to \infty}\frac{M_n}{n!}(x-a)^n = 0$.
I'm having a hard time figuring out what exactly is going on here. Could someone help me understand why the above is true, without going into the details of how to prove it?
Thanks!
You are given that $\lim_n {M_n \over n!} R^n = L < \infty$. If $x \in (a-R,a+R)$, then $d=|x-a| < R$.
We have ${M_n \over n!} |x-a|^n = {M_n \over n!} d^n = {M_n \over n!} R^n \left({d \over R}\right)^n$.
Let $a_n = {M_n \over n!} R^n$ and $b_n = \left({d \over R}\right)^n$. We are given that $a_n \to L$, and since $0 \le {d \over R} < 1$ we have $b_n \to 0$. A convergent sequence is bounded, and the product of a bounded sequence with a null sequence tends to $0$, hence $a_n b_n \to 0$, which is exactly the claimed limit.
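If it helps to see the decay concretely, here is a quick numerical sanity check of that last step. The values of $L$, $d$, and $R$ below are illustrative choices, not taken from the problem:

```python
# Sanity check: if a_n -> L (finite) and b_n = (d/R)^n with 0 <= d < R,
# then the product a_n * b_n -> 0.
# L, d, R are hypothetical illustrative values, not from the problem.
L, d, R = 3.0, 0.9, 1.0

for n in [1, 5, 10, 25, 50, 100]:
    a_n = L + 1.0 / n        # any sequence converging to L works here
    b_n = (d / R) ** n       # geometric decay, since d/R < 1
    print(f"n={n:3d}   a_n*b_n = {a_n * b_n:.3e}")
```

The product shrinks geometrically even though $a_n$ itself never goes to $0$, which is the whole point of factoring out $R^n$.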