When is the Taylor series bounded by a partial sum?


Cliff notes: Is there a nice, general property of a Taylor series that shows when it is everywhere bounded below (or above) by one of its partial sums? E.g., is there a nice/simple way to show that $$ \cos(2\pi x) = \sum_{i=0}^\infty \frac{(-4\pi^2 x^2)^i}{(2i)!} \geq \sum_{i=0}^{2k+1}\frac{(-4\pi^2 x^2)^i}{(2i)!} \qquad (*) $$ for all integers $k \geq 0$ and $x \in \mathbb{R}$, or, say, that $$ \cos(x)\cos(x^2) = 1-x^2/2 - 11x^4/24 + \cdots \geq 1-x^2/2 - 11x^4/24 \qquad (**) \; ? $$


It is often extremely useful not just to approximate a Taylor series by its partial sum, but to bound it. E.g., we very frequently use the inequality

$$e^{-x} = 1 - x + x^2/2 - x^3/6 + x^4/24 - \cdots \geq 1-x \; .$$

(I'm using $e^{-x}$ instead of just $e^x$ to stress the fact that the summands can be negative, since the inequality is completely trivial otherwise.)

We typically prove this from the convexity of $e^{-x}$ (a convex function lies above its tangent line, which at $x = 0$ is $1 - x$), which immediately implies that $$ x^2/2 - x^3/6 + x^4/24 - \cdots $$ is non-negative. But for even slightly more complicated cases, like the inequality $$ \cos(x) = 1- x^2/2 + x^4/24 - \cdots \geq 1-x^2/2 \qquad (***) \; , $$ this simple technique fails.

In fact, the best proof of $(***)$ that I know observes that the inequality is trivial except in a small interval around zero, in which the fourth derivative of $\cos(x)$ (which is just $\cos(x)$) is positive. My main gripe with this proof is that it gets quite ugly when we try to generalize it. Proving $(*)$ with this strategy is already a pain, and generalizing to even slightly less pretty functions like that in $(**)$ seems like a nightmare.
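As a quick numerical sanity check (my addition, not a substitute for the proof), the case split can be mirrored in a short Python snippet: for $|x| \geq 2$ the inequality $(***)$ is immediate, since then $1 - x^2/2 \leq -1 \leq \cos(x)$, so only a bounded interval needs checking.

```python
import math

# (***): cos(x) >= 1 - x^2/2.  For |x| >= 2 it is trivial, since then
# 1 - x^2/2 <= -1 <= cos(x); a grid check covers the remaining interval.
for i in range(-2000, 2001):
    x = i / 1000.0  # grid over [-2, 2]
    assert math.cos(x) >= 1 - x * x / 2
```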

Is there a nice way to handle such things?

Accepted answer

I guess the original question that I asked was far too general and vague, but with Siyao Guo and Alex Lombardi we at least found a cute proof that, e.g.,

$$1 - x + \frac{x^2}{2} - \frac{x^3}{6} + \cdots - \frac{x^{2k-1}}{(2k-1)!} \leq e^{-x} \leq 1-x + \frac{x^2}{2} + \cdots + \frac{x^{2k}}{(2k)!} \; $$

for integer $k \geq 1$ and $x \geq 0$. We can similarly prove bounds for $\cos(x)$ and $\sin(x)$. I doubt this is original, but since I couldn't find it by Googling, I figured it was worth posting.

Equivalently, we want to prove that the tails of the series satisfy $$f_k(x) := \sum_{i=0}^{\infty} (-1)^i \frac{x^{k+i}}{(k+i)!} \geq 0 \; $$ for $x \geq 0$ (the lower and upper bounds above say exactly that $f_{2k}(x) \geq 0$ and $f_{2k+1}(x) \geq 0$, respectively). The proof is by induction on $k$. The base case is $k = 0$, which is trivial since $f_0(x) = e^{-x} > 0$. For $k \geq 1$, we simply note that the derivative satisfies $\frac{\rm d}{{\rm d} x} f_k(x) = f_{k-1}(x)$. By induction, the derivative is non-negative for $x \geq 0$, and the result follows from the fact that $f_k(0) = 0$.

The proof for $\cos(x)$ and $\sin(x)$ is essentially the same and works by just replacing $f_k$ by

$$g_k(x) := \sum_{i=0}^\infty (-1)^i \frac{x^{k + 2i}}{(k+2i)!} \; ,$$

so that $g_0(x) = \cos(x)$, $g_1(x) = \sin(x)$, and $\frac{\rm d}{{\rm d} x} g_k(x) = g_{k-1}(x)$.
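The analogous bounds for $\cos(x)$ and $\sin(x)$ can be spot-checked the same way (again my addition; `cos_partial` and `sin_partial` are hypothetical names for the truncations of $g_0$ and $g_1$):

```python
import math

def cos_partial(x, terms):
    """First `terms` terms of g_0(x): sum_{i < terms} (-1)^i x^{2i} / (2i)!."""
    return sum((-1) ** i * x ** (2 * i) / math.factorial(2 * i)
               for i in range(terms))

def sin_partial(x, terms):
    """First `terms` terms of g_1(x): sum_{i < terms} (-1)^i x^{2i+1} / (2i+1)!."""
    return sum((-1) ** i * x ** (2 * i + 1) / math.factorial(2 * i + 1)
               for i in range(terms))

# For x >= 0: an even number of terms ends on a subtracted term (lower bound),
# an odd number ends on an added term (upper bound).
for x in [0.0, 0.5, 1.0, 3.0, 10.0]:
    for k in range(1, 5):
        assert cos_partial(x, 2 * k) <= math.cos(x) <= cos_partial(x, 2 * k + 1)
        assert sin_partial(x, 2 * k) <= math.sin(x) <= sin_partial(x, 2 * k + 1)
```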

Answer

Is there a nice way to handle such things?

Not for all sums. But for sums coming from objects like the sine and cosine expansions, you can use the bound coming from alternating series. More formally, if you are interested in $$ \sum_{n \geq 1} (-1)^{n+1} a(n)$$ (written so that the first term enters with a positive sign) and the $a(n)$ satisfy

  1. $a(n) \geq 0$
  2. $a(n) \geq a(n+1)$
  3. $a(n) \to 0$

then you have that

$$ a(1) - a(2) + \cdots - a(N) \leq \sum_{n \geq 1} (-1)^{n+1} a(n) \leq a(1) - a(2) + \cdots - a(N) + a(N+1)$$ for any even $N$ (so that the left-hand partial sum ends on a subtracted term). This is commonly proved while proving the alternating series test for convergence.
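A tiny Python illustration (my addition, not from the answer), taking $a(n) = 1/n$, whose alternating sum is $\ln 2$; the sum is written with the first term positive:

```python
import math

def alt_partial(a, N):
    """Partial sum a(1) - a(2) + ... of the first N terms (first term positive)."""
    return sum((-1) ** (n + 1) * a(n) for n in range(1, N + 1))

# Example: a(n) = 1/n, whose alternating sum is ln 2.
a = lambda n: 1.0 / n
for N in [2, 4, 10, 100]:  # even N: partial sum ends on a subtracted term
    assert alt_partial(a, N) <= math.log(2) <= alt_partial(a, N) + a(N + 1)
```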

In practice, alternating series are so much easier to analyze that it is sometimes better to figure out how to decompose an object into alternating series than to attack it directly.