Taylor's Theorem Remainder with Unbounded Derivative


According to the Wikipedia entry and a few other sources I've seen online, the Lagrange form of the remainder, which involves an $(n+1)$th derivative, can be used as long as $f: \mathbb R \to \mathbb R$ is $n+1$ times differentiable and $f^{(n)}$ is continuous. I am going to assume bounded intervals here, since I think that is implicit in most statements. The remainder form is written like this: \begin{equation} f(x)=f(a)+\sum_{k=1}^{n}\frac{f^{(k)}(a)(x-a)^k}{k!}+ \frac{f^{(n+1)}(c)(x-a)^{n+1}}{(n+1)!} \end{equation} Since an $(n+1)$th derivative is involved, my question is: how do we know that this derivative is bounded? According to the comments on this answer: https://math.stackexchange.com/a/492165/463358, boundedness of the $(n+1)$th derivative needs to be asserted separately, but then someone comments that the assumptions above are already enough to imply it.
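(As a sanity check on the formula itself, separate from the boundedness question, one can verify numerically that such a $c$ exists for a concrete smooth function. The choice $f = \exp$, $a = 0$, $n = 2$ below is my own illustration, not taken from any of the linked sources.)

```python
import math

# Lagrange remainder check for f = exp, a = 0, n = 2:
#   e^x = 1 + x + x^2/2 + e^c * x^3 / 3!   for some c in (0, x).
x, n = 1.0, 2
poly = sum(x**k / math.factorial(k) for k in range(n + 1))  # degree-2 Taylor polynomial at 0
remainder = math.exp(x) - poly
# Solve  e^c * x^(n+1) / (n+1)! = remainder  for c
c = math.log(remainder * math.factorial(n + 1) / x**(n + 1))
print(c)  # ~0.2699, inside (0, 1) as the theorem predicts
```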

I currently think that boundedness is not implied by the conditions above: continuity of $f^{(n)}$ seems to give at best uniform continuity on a bounded interval, not something stronger like absolute continuity or a Lipschitz condition (the latter would bound $f^{(n+1)}$ wherever it exists). And since $f^{(n+1)}$ is not assumed continuous, we cannot conclude via the extreme value theorem that it is bounded on a bounded interval.

I am not sure what I'm missing here and I've searched around a lot but can't find anything clear enough. Thanks for the help!

(There's a similar question here: Conditions of the Taylor Theorem, but it doesn't seem to address my particular question.)


Accepted answer

As pointed out in the comments, the derivative $f^{(n+1)}$ does not have to be bounded, and the remainder form is still valid.

Consider the example $n=0$ with $f(x) = x^2 \sin (x^{-2})$ for $0 < x \leqslant 1$ and $f(0) = 0$. Then $f$ is continuous and differentiable at every point of $[0,1]$: in particular $f(0) = 0$ and $f'(0) = \lim_{x \to 0^+} x \sin(x^{-2}) = 0$, while for $0 < x \leqslant 1$,

$$f'(x) = 2x \sin(x^{-2}) - 2x^{-1} \cos(x^{-2}) $$

where $f'$ is unbounded in every neighborhood of $x = 0$ because of the second term, $2x^{-1}\cos(x^{-2})$.

However, the mean value theorem still applies: for any $x \in (0,1]$ there exists $\xi \in (0,x)$ such that $f(x) - f(0) = f'(\xi)\,x$, i.e.

$$x^2 \sin(x^{-2}) = x\left(2\xi \sin(\xi^{-2}) - 2\xi^{-1}\cos(\xi^{-2})\right)$$

For example, taking $x=1$, one can check that $\xi \approx 0.0118822803500313$ works (and there are many other admissible values).
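One can confirm this numerically. The sketch below (my own check, not part of the original answer) locates a $\xi$ with $f'(\xi) = f(1) = \sin(1)$ by scanning for a sign change and bisecting, and also samples how large $|f'|$ gets near $0$:

```python
import math

def fprime(t):
    # f'(t) for f(t) = t^2 sin(t^-2), t > 0; the 2/t factor makes it unbounded near 0
    return 2 * t * math.sin(t ** -2) - (2 / t) * math.cos(t ** -2)

target = math.sin(1.0)  # f(1) - f(0) = f'(xi) * 1 by the mean value theorem

# Bracket a solution of fprime(t) = target by scanning, then refine by bisection.
lo = hi = None
t, step = 0.1, 1e-4
while t + step < 0.9:
    if (fprime(t) - target) * (fprime(t + step) - target) < 0:
        lo, hi = t, t + step
        break
    t += step
for _ in range(60):
    mid = (lo + hi) / 2
    if (fprime(lo) - target) * (fprime(mid) - target) <= 0:
        hi = mid
    else:
        lo = mid
xi = (lo + hi) / 2
print(xi, fprime(xi))  # fprime(xi) matches sin(1) ~ 0.8415

# f' blows up as t -> 0+: sample t in [0.001, 0.011]
peak = max(abs(fprime(i * 1e-5)) for i in range(100, 1100))
print(peak)  # hundreds or more
```

It finds a different admissible $\xi$ than the one quoted above, consistent with there being many.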

What this tells you is that even when a low-order Taylor approximation is valid with this remainder form, the error represented by that remainder can be extremely large, as in this example of a constant approximation to a wildly oscillating function.

Another answer

I came across a helpful PDF (link at the bottom) that works through the Taylor polynomial of $\sin(x)$ and shows how the remainder arises. The PDF makes clear that the coefficient in the remainder is essentially just a number that bounds the error. Because the proof is based on Rolle's theorem, changing the interval $[a,b]$ even slightly can cause the point $c$ in $f^{(n+1)}(c)$ to jump, so the error term can deviate strongly from whatever value it had on a nearby interval. In other words, the constant $K$ (proportional to the error) depends on the interval you choose, and nothing in Rolle's theorem requires $K$ to vary smoothly as the interval changes.

Another fun fact: whenever the $(n+1)$th derivative exists, the remainder at a fixed $x$ is some finite number, so the mere existence of a finite remainder is not enough to show that the Taylor series converges to the function. You also have to show that $\lim_{n \to \infty} R_{n+1} = 0$.
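For $\sin$, every derivative is bounded by $1$ in absolute value, so the Lagrange bound $|R_{n+1}| \leqslant |x|^{n+1}/(n+1)!$ does tend to $0$. A small check (my own illustration, in the spirit of the linked PDF's example):

```python
import math

def sin_taylor(x, n):
    # Maclaurin polynomial of sin(x) up to degree n (odd powers, alternating signs)
    total, sign = 0.0, 1.0
    for k in range(1, n + 1, 2):
        total += sign * x**k / math.factorial(k)
        sign = -sign
    return total

x = 1.0
for n in (1, 3, 5, 7, 9):
    err = abs(math.sin(x) - sin_taylor(x, n))
    bound = abs(x) ** (n + 1) / math.factorial(n + 1)  # uses |sin^{(n+1)}| <= 1
    print(n, err, bound)  # err stays below bound, and both shrink toward 0
```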

Link [PDF]: https://math.dartmouth.edu/archive/m8w10/public_html/m8l02.pdf

I wrote this answer on my phone, so apologies for the rough quality, but hopefully you will find it useful.