Note that I am specifically looking at this version of Taylor's theorem:
Let $f: \mathbb{R}\to\mathbb{R}$ be $n$-times differentiable at $x$. Then $\exists\,g : \mathbb{R} \to \mathbb{R}$ where $\lim\limits_{h\to 0}g(x + h) = 0$ and $$f(x + h) = h^n\, g(x + h) + \sum_{k=0}^n\frac{h^k}{k!}\,f^{(k)}(x)$$
My question is, what makes this a nontrivial theorem? Can't we just solve for $g$ in a single step?
Why is the observation that $g$ exists useful at all?
(Answering my own question, since this is something that had been confusing me some time ago, and I posted this question for others' reference.)
The nontriviality is in the condition
$$\lim\limits_{h\to 0}g(x + h) = 0$$
The theorem would be trivial if this condition did not need to hold: for $h \neq 0$ we could simply solve the equation for $g$ by dividing through by the $h^n$ coefficient. It is far from obvious, however, that the $g$ obtained this way necessarily satisfies the limit condition as well.
Clarification: The $h^n$ factor does not pose a problem for $g$, despite the apparent division by zero at $h = 0$ when solving for $g$: we can define $g$ piecewise, setting $g(x) = 0$ at that single point, which avoids the division entirely. The real difficulty is the limit (continuity) requirement on $g$ at $x$, not the mere existence of some function $g$ satisfying the equation.
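To make the piecewise construction explicit, the $g$ obtained by solving the equation (with the single problematic point handled separately) is:
$$g(x + h) = \begin{cases} \dfrac{1}{h^n}\left(f(x + h) - \displaystyle\sum_{k=0}^n\frac{h^k}{k!}\,f^{(k)}(x)\right) & h \neq 0, \\[2ex] 0 & h = 0. \end{cases}$$
This function trivially satisfies the equation for every $h$; the entire content of the theorem is the claim that $\lim\limits_{h\to 0}g(x + h) = 0$, i.e. that the remainder $f(x+h) - \sum_{k=0}^n\frac{h^k}{k!}\,f^{(k)}(x)$ vanishes faster than $h^n$.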