When approximating a function using a Taylor series, can your approximation get worse by adding the next term?


I’ve seen the remainder theorem for a Taylor approximation of a function. I’ve also seen that in the limit as $n$ approaches infinity, the remainder goes to zero.

I was wondering: when approximating a function using a Taylor series, is it ever possible that adding just the next term makes your approximation actually worse? Do you have to solve the remainder equation to know this for each individual case, or is there some general rule?

Does anyone have an example where this is the case?

The Taylor series of $f$ about an expansion point $x_{Taylor}$ is: $f(x)=\sum_{n=0}^{\infty}\frac{1}{n!} \left.\frac{d^{n}f}{dx^{n}}\right|_{x=x_{Taylor}}(x-x_{Taylor})^{n}$

Here $f(x)$ denotes both the given function to be developed into a Taylor series and the series that approximates or represents it.

How far the approximation or representation holds is determined by the radius of convergence, which can be calculated.

As a first example take $f(x)=x-x^2$ and decide at which point to develop it.

$x_{Taylor}=0$:

$f(0)=0$, $f'(x)=1-2x$, $f'(0)=1$, $f''(x)=-2$, $f''(0)=-2$, $f'''(x)=0$, and so on for all higher orders. So the Taylor series is finite, as it always is for polynomials with non-negative integer exponents only: $f(x)=0+1\cdot x+\frac{-2}{2}x^{2}=x-x^{2}$. The Taylor series of such a polynomial is the polynomial itself.
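That the truncated series reproduces the polynomial exactly, everywhere and not just near the expansion point, can be checked in a few lines. A minimal sketch (the names are mine; only plain Python is assumed):

```python
import math

# Taylor coefficients f^(n)(0)/n! of f(x) = x - x**2 about 0 are [0, 1, -1];
# all higher coefficients vanish, so the series terminates.
coeffs = [0.0, 1.0, -1.0]

def taylor_eval(x):
    """Evaluate the (finite) Taylor series sum_n coeffs[n] * x**n."""
    return sum(c * x ** n for n, c in enumerate(coeffs))

# The truncated series equals the polynomial at an arbitrary point.
assert math.isclose(taylor_eval(3.7), 3.7 - 3.7 ** 2)
```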

This is the trivial, tautological case.

Look for example at $g(x)=\sqrt{x}$. Here infinitely many derivatives are nonzero, so the series does not terminate. If you try to develop the Taylor series at $x=0$ you fail, because the derivatives do not exist there. For any expansion point larger than $0$ there is a Taylor series, but that series still fails to approximate the square root function on all of its domain.

$x_{Taylor}=1$:

$g(x)=\sqrt{x}=\sum_{n=0}^{\infty}\frac{1}{n!} \left.\frac{d^{n}\sqrt{x}}{dx^{n}}\right|_{x=x_{Taylor}}(x-x_{Taylor})^{n}$

$\frac{d^{n}\sqrt{x}}{dx^{n}}=\frac{1}{2}\left(\frac{1}{2}-1\right)\cdots\left(\frac{1}{2}-n+1\right)x^{1/2-n}$ for $n\ge 1$

For $n\ge 1$ these derivatives do not exist at $x_{Taylor}=0$, and each of them diverges as $x_{Taylor}\rightarrow 0$. So this example is continuous, real, and infinitely differentiable for $x>0$, yet the approximation breaks down as $x_{Taylor}\rightarrow 0$. The larger $x_{Taylor}$ is taken, the larger the range above $0$ that can be approximated nicely; in fact the radius of convergence equals $x_{Taylor}$, so the point $0$ itself is never reached.

This is a simple example of a power function with a real, non-integer exponent that causes problems for the Taylor series. The same occurs for functions whose singular point lies elsewhere, such as $\sqrt{x-a}$ with $a$ real.
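The behavior outside the radius of convergence also answers the original question with a concrete case. A minimal numerical sketch (function names are my own; only the standard library is assumed) sums the series for $\sqrt{x}$ about $x_{Taylor}=1$: inside the radius of convergence the error goes to zero, while outside it adding the next term eventually makes the approximation worse.

```python
import math

def sqrt_taylor_partial_sums(x, x0=1.0, n_terms=20):
    """Partial sums of the Taylor series of sqrt(x) about x0 = 1.

    The n-th coefficient is the generalized binomial coefficient C(1/2, n),
    built up via the recurrence C(1/2, n+1) = C(1/2, n) * (1/2 - n) / (n + 1).
    """
    c = 1.0          # C(1/2, 0)
    s = 0.0
    sums = []
    for n in range(n_terms):
        s += c * (x - x0) ** n
        sums.append(s)
        c *= (0.5 - n) / (n + 1)
    return sums

# Inside the radius of convergence (|x - 1| < 1) the error shrinks to zero ...
err_in = [abs(s - math.sqrt(1.5)) for s in sqrt_taylor_partial_sums(1.5)]
# ... outside (|x - 1| > 1) adding terms eventually makes things worse.
err_out = [abs(s - math.sqrt(2.5)) for s in sqrt_taylor_partial_sums(2.5)]
```

At $x=2.5$ the partial sums oscillate with growing amplitude, so there are many steps where the next term strictly increases the error.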

For the definitions, look at Radius of convergence.

With rational functions, singularities occur. Between the poles a Taylor series approximation may be useful, but never at the poles themselves. The same holds for transcendental functions in the denominators of quotients, and so on. The range of composite functions is large; remember inverse functions as well. Many of these cases can be transformed into representations where the Taylor series converges again. Pages like this Taylor series offer a table with the most important Taylor series.

This one also offers links to further collections available online and to tools that assist with the calculations.
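The pole case can be sketched with the simplest rational function (my own minimal example, standard library only): the Taylor series of $1/(1-x)$ about $0$ is the geometric series, whose radius of convergence is $1$ because of the pole at $x=1$. Inside that radius every added term improves the approximation; beyond the pole every added term makes it strictly worse.

```python
def geom_partial_sums(x, n_terms=30):
    """Partial sums of the Taylor series of 1/(1-x) about 0: 1 + x + x**2 + ..."""
    s, sums = 0.0, []
    for n in range(n_terms):
        s += x ** n
        sums.append(s)
    return sums

def errors(x):
    """Absolute error of each partial sum against the true value 1/(1-x)."""
    target = 1.0 / (1.0 - x)
    return [abs(s - target) for s in geom_partial_sums(x)]

err_inside = errors(0.9)    # converges towards 1/(1 - 0.9) = 10
err_outside = errors(1.1)   # true value is -10; the partial sums run away
```

Here the general rule is visible: within the radius of convergence more terms help, outside it they hurt, and the remainder estimate is what decides which regime a given $x$ is in.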