What happens when I convert a Taylor series into an integral?


Suppose we have the Taylor series of an analytic function as follows:

$$f(x) = \sum_{k=0}^\infty \frac{1}{k!} a_k x^k$$

Then I decide to (kind of) turn it into an integral:

$$g(x) = \int_0^\infty \frac{1}{\Gamma(k+1)} a(k) x^k \, dk$$

Clearly, $f(x) \neq g(x)$. But the values the two functions produce are somewhat close to each other. What's the relation between the two?
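As a quick numerical sanity check, here is a sketch comparing the two directly. It assumes the illustrative choice $a_k \equiv 1$ (so $f(x)=e^x$) and a small $x$ so the terms decrease; the integral is truncated at a large finite upper limit, where the integrand is negligible:

```python
import math

def f(x, a, nterms=100):
    """Taylor sum  f(x) = sum_k a(k) x^k / k!  (truncated)."""
    return sum(a(k) * x**k / math.factorial(k) for k in range(nterms))

def g(x, a, upper=60.0, n=60000):
    """Integral version  g(x) = int_0^upper a(t) x^t / Gamma(t+1) dt,
    approximated with composite Simpson's rule."""
    h = upper / n
    total = 0.0
    for i in range(n + 1):
        t = i * h
        w = 1 if i in (0, n) else (4 if i % 2 else 2)
        total += w * a(t) * x**t / math.gamma(t + 1)
    return total * h / 3

a = lambda t: 1.0   # illustrative assumption: a_k = 1, so f(x) = e^x
x = 0.5             # for x < 1 the terms x^k/k! decrease monotonically
fx, gx = f(x, a), g(x, a)
print(fx, gx)       # gx lies between f(x) - a_0 and f(x)
```

For this choice the integral lands strictly between $f(x)-a_0$ and $f(x)$, which matches the integral-test comparison in the answers below.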



I think you can use Abel's summation formula to relate the two, since it lets you transform a series into an integral.
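For reference, one common form of Abel's summation formula (stated for a continuously differentiable weight $\phi$, with $A(u)=\sum_{0<k\leq u} a_k$) is:

$$\sum_{0 < k \leq x} a_k \, \phi(k) = A(x)\,\phi(x) - \int_0^x A(u)\,\phi'(u)\,du.$$

With $\phi(k)=x^k/\Gamma(k+1)$, this expresses the tail of the Taylor series as a boundary term plus an integral, though the integrand is not the same as the one in the question.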


I'm assuming that you define $a(k)=a_{[k]}$, where $[k]$ denotes the floor function. You could define it with the ceiling function instead and the answer would be essentially the same. First, review the statement and proof of the integral test.

The gamma function $\Gamma(k+1)$ is eventually increasing in $k$, and let's assume $a_k x^k/k!$ is decreasing (the terms must tend to zero regardless), so in this case the integral test gives something like:

$$f(x)-a_0=\sum_{k\geq 1}\frac{a_k}{k!}x^k\leq \int_0^\infty \frac{a(k)}{\Gamma(k+1)}x^kdk \leq \sum_{k\geq 0} \frac{a_k}{k!}x^k=f(x).$$
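This sandwich can be checked slice by slice: when the integrand is decreasing, the integral over each unit interval $[k-1,k]$ is squeezed between the adjacent terms. A minimal sketch, again assuming the illustrative choice $a_k = 1$ and $x = 0.5$:

```python
import math

x, a = 0.5, lambda t: 1.0   # illustrative assumption: a_k = 1, x = 0.5

def integrand(t):
    return a(t) * x**t / math.gamma(t + 1)

def piece(k, n=2000):
    """Simpson's rule for the integral of the integrand over [k-1, k]."""
    h = 1.0 / n
    s = sum((1 if i in (0, n) else 4 if i % 2 else 2) * integrand(k - 1 + i * h)
            for i in range(n + 1))
    return s * h / 3

for k in range(1, 8):
    term_k  = x**k / math.factorial(k)          # value at the right endpoint
    term_km = x**(k - 1) / math.factorial(k - 1)  # value at the left endpoint
    assert term_k <= piece(k) <= term_km        # each slice is squeezed by adjacent terms
```

Summing these per-slice bounds over $k\geq 1$ recovers exactly the displayed chain of inequalities.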

Note that if the terms are not monotonically decreasing, then you'll see considerably more deviation between the sum and the integral.

Be careful with the $a_0$ term. If $a_0=0$, for example, then by our monotonicity assumption all the other terms would be zero as well. So if you want to correct for this, you need to shift the sum further before doing the integral/sum comparison.