Let $f\colon\mathbb{R}\to\mathbb{R}$ be a real function, and let $0\leq n\leq+\infty$. We make the following assumption:
For every $a \in\mathbb{R}$ and for $k=n$ (or, in the case $n=+\infty$, for every integer $k\geq 0$), there exist real numbers $c_0(a),\ldots,c_k(a)$ such that
$$f(x) = c_0(a) + c_1(a)\,(x-a) + \frac{1}{2}c_2(a)\,(x-a)^2 + \cdots + \frac{1}{k!}c_k(a)\,(x-a)^k + o((x-a)^k)$$
where, as usual, $o((x-a)^k)$ means $(x-a)^k\,\varepsilon_{a,k}(x)$ for some function $\varepsilon_{a,k}$ tending to $0$ when $x \to a$.
In other words, we assume that $f$ has a power expansion of order $k$ with $o$ error term at (every) $a$. Note that no assumption is made on uniformity of the $o$ error term when $a$ varies (e.g., we do not assume that $\varepsilon_{a,k}(x)$ is bounded by some function of $x-a$): we only assume that for every $a$ there exists an expansion of order $k$ as above, nothing more.
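To make the hypothesis concrete, here is a minimal numerical illustration (my own, not part of the question): for the smooth function $f = \exp$ at $a = 0$ one may take $c_k(0) = 1$ for all $k$, and the error term $\varepsilon_{0,k}(x) = \bigl(f(x) - \sum_{i\leq k} x^i/i!\bigr)/x^k$ indeed tends to $0$ as $x \to 0$.

```python
import math

# Illustration of the expansion hypothesis for f = exp at a = 0,
# where one may take c_k(0) = 1 for every k, since
#   exp(x) = sum_{i=0}^{k} x^i / i!  +  o(x^k).

def eps(x, k):
    """The error function eps_{0,k}(x) = (f(x) - partial sum) / x^k."""
    partial = sum(x**i / math.factorial(i) for i in range(k + 1))
    return (math.exp(x) - partial) / x**k

# eps_{0,2}(x) -> 0 as x -> 0 (the values shrink roughly like x/6):
vals = [abs(eps(10.0**-m, 2)) for m in range(1, 4)]
```

This only checks the order-2 expansion at a single point, of course; the question's hypothesis asks for such an expansion at *every* $a$.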
Naturally, the $c_i(a)$ are uniquely determined; we have $c_0 = f$ (that is, $c_0(a) = f(a)$ for every $a$), and $f$ is continuous. Moreover, as soon as $n\geq 1$, $f$ is clearly differentiable with derivative $f' = c_1$.
We cannot deduce that $f$ is twice differentiable, or even $C^1$, from the above hypothesis alone, no matter how large $n$ is. The simple example of $f(x) = x^{n+1} \sin(x^{-n})$ (with $f(0)=0$) provides a counterexample: it is $o(x^n)$ at $0$ and analytic everywhere else, so it has a power expansion of order $n$ everywhere, yet it is easily seen not to be even $C^1$ at $0$. A slightly more complicated counterexample works for $n=\infty$.
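The failure of continuity of $f'$ can be seen numerically. Here is a small check (my own illustration, taking $n = 1$ for concreteness): $f(x) = x^2\sin(1/x)$ is differentiable everywhere with $f'(x) = 2x\sin(1/x) - \cos(1/x)$ for $x \neq 0$ and $f'(0) = 0$, but $f'$ oscillates between values near $-1$ and $+1$ along sequences tending to $0$.

```python
import math

# Counterexample for n = 1: f(x) = x^2 sin(1/x), f(0) = 0.
# f(x) = o(x) at 0, so f has an order-1 expansion everywhere,
# yet f' is not continuous at 0, so f is not C^1.

def fprime(x):
    # Exact derivative for x != 0: f'(x) = 2x sin(1/x) - cos(1/x).
    return 2 * x * math.sin(1.0 / x) - math.cos(1.0 / x)

# Along x = 1/(2k*pi), cos(1/x) = 1, so f'(x) is close to -1;
# along x = 1/((2k+1)*pi), cos(1/x) = -1, so f'(x) is close to +1.
# Hence lim_{x -> 0} f'(x) does not exist.
near_minus_one = [fprime(1.0 / (2 * k * math.pi)) for k in range(1, 6)]
near_plus_one = [fprime(1.0 / ((2 * k + 1) * math.pi)) for k in range(1, 6)]
```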
Now here is my question. Let us make the following additional assumption (which is not satisfied for the above counterexample):
For each $0\leq k\leq n$, the function $c_k$ (that is, $a\mapsto c_k(a)$) is continuous.
(In particular, if $n\geq 1$, it is now clear that $f$ is $C^1$.)
Can I conclude from both assumptions that $f$ is $C^n$? (Or, if not, can I conclude something non-trivial?)
This question is answered in the affirmative in Abraham, Robbin, *Transversal mappings and flows*, Ch. 1, §2, "A criterion for smoothness". They prove this converse to Taylor's theorem for functions between Banach spaces and attribute the one-dimensional case to Marcinkiewicz, Zygmund, *On the differentiability of functions and summability of trigonometrical series*.
As I understand after a glimpse at the proof, they prove by induction that $c_k = f^{(k)}$ by showing that $c_k(a+h) - c_k(a) = \int_0^1 c_{k+1}(a+th)\,h \, dt$. To prove that $f$ is $C^n$ and thus justify the above, they show that $c_1$ satisfies the hypothesis of the theorem with $n$ replaced by $n-1$ and then use induction (in the finite-dimensional case; a trick using Hahn–Banach allows one to reduce the theorem to that case). The proof of that fact looks elementary but tricky; in particular, they use a polynomial interpolation lemma.
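As a sanity check (my own, not from the book) of the integral identity when $f$ is already known to be smooth and $c_k = f^{(k)}$: take $f = \sin$ and $k = 1$, so $c_1 = \cos$ and $c_2 = -\sin$, and compare the two sides numerically.

```python
import math

# Check c_k(a+h) - c_k(a) = int_0^1 c_{k+1}(a + t*h) * h dt
# for the smooth function f = sin with k = 1:
#   c_1 = cos,  c_2 = -sin.

def c1(x):
    return math.cos(x)

def c2(x):
    return -math.sin(x)

def integral(a, h, steps=100_000):
    # Midpoint rule for int_0^1 c_2(a + t*h) * h dt.
    dt = 1.0 / steps
    return sum(c2(a + (i + 0.5) * dt * h) * h * dt for i in range(steps))

a, h = 0.3, 0.7
lhs = c1(a + h) - c1(a)
rhs = integral(a, h)
# lhs and rhs agree to within the quadrature error.
```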