If a function $f:\mathbb R\to\mathbb R$ satisfies $f(x) = o(x)$ as $x\to 0$ (with the bound holding on a whole neighborhood of $x=0$), then it follows easily that $f(0) = f'(0) = 0$.
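Spelled out, the $k=1$ case reads as follows (the $o(x)$ bound, taken to hold at $x=0$ as well, forces $f(0)=0$):

```latex
f'(0) = \lim_{x \to 0} \frac{f(x) - f(0)}{x - 0}
      = \lim_{x \to 0} \frac{f(x)}{x} = 0 .
```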
Analogously, suppose a function $f$ is $o(x^k)$ as $x\to 0$, where $k$ is a positive integer. (That is, $\lim_{x\to 0} f(x)/x^k = 0$.) In this case, if $f$ is $k$ times differentiable at the origin, I can deduce that $f(0) = f'(0) = f''(0) = \cdots = f^{(k)}(0) = 0$. Is the $o(x^k)$ condition strong enough to show that these first $k$ derivatives actually exist? If so, how? If not, what is a counterexample?
If $f$ is assumed to have derivatives up to order $k$ in a neighborhood of $x=0$, then $f(x) = o(x^k)$ implies $$ f(0) = f'(0) = f''(0) = \ldots = f^{(k)}(0) = 0 \, . $$ This follows, for example, from Taylor's theorem with the Peano form of the remainder, $$ f(x) = f(0) + f'(0)x + \frac{f''(0)}{2} x^2 + \ldots + \frac{f^{(k)}(0)}{k!} x^k + h_k(x) x^k $$ with $\lim_{x \to 0} h_k(x) = 0$.
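To spell out why all the coefficients must vanish, here is a short sketch, writing $c_j = f^{(j)}(0)/j!$ for brevity:

```latex
% Combining f(x) = o(x^k) with the Taylor expansion above:
c_0 + c_1 x + \cdots + c_k x^k = f(x) - h_k(x)\,x^k = o(x^k).
% Letting x \to 0 gives c_0 = 0; after dividing by x, letting
% x \to 0 gives c_1 = 0; repeating this argument k times in total
% yields c_j = 0 for all j \le k.
```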
But for $k \ge 2$, $f(x) = o(x^k)$ does not imply the existence of $f^{(k)}(0)$. A counterexample is $$ f(x) = |x|^{k+1}\,\mathbf 1_{\mathbb Q}(x) \, , $$ i.e. $|x|^{k+1}$ times the characteristic function of the rationals, which is differentiable only at $x=0$, so that $f''$ exists nowhere.
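For completeness, the claimed properties of this $f$ can be checked directly from the bound $0 \le f(x) \le |x|^{k+1}$:

```latex
\left|\frac{f(x)}{x^k}\right| \le |x| \xrightarrow[x \to 0]{} 0
  \qquad\Longrightarrow\qquad f(x) = o(x^k),
\\[1ex]
\left|\frac{f(x) - f(0)}{x - 0}\right| \le |x|^k \xrightarrow[x \to 0]{} 0
  \qquad\Longrightarrow\qquad f'(0) = 0 .
```

At any $a \neq 0$, rational sequences $x_n \to a$ give $f(x_n) \to |a|^{k+1} \neq 0$ while irrational sequences give $f(x_n) \to 0$, so $f$ is discontinuous, hence not differentiable, at every $a \neq 0$; consequently $f''$ exists at no point.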