Currently going through Stephen Abbott's *Understanding Analysis*, and I've just gotten to the proof of Lagrange's Remainder Theorem, which states that if $f$ is $(N+1)$-times differentiable and $x \ne 0$, then there is some $c$ with $|c| < |x|$ such that: $$ E_N(x) = f(x) - S_N(x) = \frac{f^{(N+1)}(c)}{(N+1)!}x^{N+1}$$ He then proceeds to not do much else with Taylor remainders.
From my perspective, this theorem is useless as a standalone conclusion. It tells us how the error behaves at one specific point $c$ (and we don't even know where that point is), and by itself it doesn't bound the error even on an $\epsilon$-neighborhood, when what we really want is an upper bound on $E_N(x)$ over some reasonably sized interval.
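To illustrate the kind of statement I'm after (just a sketch, assuming we happen to have a uniform bound $M$ on the $(N+1)$-st derivative, which Lagrange's theorem itself does not provide):

$$\text{If } |f^{(N+1)}(t)| \le M \text{ for all } |t| \le R, \text{ then for every } |x| \le R,\qquad |E_N(x)| = \left|\frac{f^{(N+1)}(c)}{(N+1)!}\,x^{N+1}\right| \le \frac{M\,R^{N+1}}{(N+1)!}.$$

That is, I want bounds that hold uniformly on $[-R, R]$, not just pointwise at a mystery point $c$.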
Are there any proofs of error bounds for Taylor series on intervals that are accessible at this stage of the book?