Why am I getting this absurd result by integrating Taylor series?


Just as integrating the Taylor series of $\sin x$ term by term gives $-\cos x + C$, I did the same with the general Taylor series. Integrating the Taylor series of $f(x)$ around $x=a$ term by term gives $$f(a)\,x+f'(a)\,\frac{(x-a)^2}{2}+\cdots,$$ which evaluates to $a\,f(a)$ at $x=a.$

That seems absurd, because it would mean that the antiderivative of any function $f(x)$, evaluated at $x=a$, is $a\cdot f(a)$.

Worse, it would mean the definite integral of any function from $a$ to $b$ is $b\,f(b)-a\,f(a)$, which is clearly false.
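Indeed, the claim is easy to falsify numerically. A quick sketch with $f(x)=x^2$ on $[1,2]$ (my choice of example, not from the question):

```python
# Compare the true integral of f(x) = x^2 on [a, b] = [1, 2]
# with the (wrong) formula b*f(b) - a*f(a).
a, b = 1.0, 2.0
f = lambda x: x**2

true_integral = b**3 / 3 - a**3 / 3   # antiderivative x^3/3, exact: 7/3
wrong_value = b * f(b) - a * f(a)     # 2*4 - 1*1 = 7

print(true_integral)  # 2.333...
print(wrong_value)    # 7.0
```

The two values disagree (7/3 versus 7), so something in the derivation must be off.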

So, what did I do wrong?

1 Answer
You forgot the integration constant: the term-by-term integral gives

$$F(a)=a\,f(a)+C,$$ which is perfectly valid, since $C$ can take any value.

And when performing the definite integration, you implicitly assumed that $F(b)=b\,f(b)$, which is wrong.
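One way to see the constant at work, sketched with $f(x)=x^2$ as a concrete example (my choice, not the answer's): the term-by-term antiderivative $G(x)=f(a)x+f'(a)\frac{(x-a)^2}{2}+f''(a)\frac{(x-a)^3}{6}$ differs from $x^3/3$ by a constant, so it is a perfectly good antiderivative even though $G(a)=a\,f(a)\ne a^3/3$.

```python
# Term-by-term antiderivative of the Taylor series of f(x) = x^2 about a:
# G(x) = a^2*x + a*(x-a)^2 + (x-a)^3/3.
# G'(x) = a^2 + 2a(x-a) + (x-a)^2 = x^2, so G differs from x^3/3
# only by a constant (which depends on the expansion point a).
a = 1.5  # any expansion point works

def G(x):
    return a**2 * x + a * (x - a)**2 + (x - a)**3 / 3

for x in (0.0, 1.0, 2.0, 5.0):
    print(G(x) - x**3 / 3)   # same constant at every x

print(G(a))                  # equals a * f(a) = a^3
```

The printed differences are all equal, confirming that the series integration only pins down the antiderivative up to a constant.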

The true relation is

$$F(b)-F(a)=\left.f(a)x+f'(a)\frac{(x-a)^2}2+f''(a)\frac{(x-a)^3}6+\cdots\right|_a^b\\ =f(a)(b-a)+f'(a)\frac{(b-a)^2}2+f''(a)\frac{(b-a)^3}6+\cdots$$
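This relation can be checked numerically. Again taking $f(x)=x^2$ (my example), whose Taylor series terminates after the $f''$ term, so the identity is exact:

```python
# Verify F(b) - F(a) = f(a)(b-a) + f'(a)(b-a)^2/2 + f''(a)(b-a)^3/6
# for f(x) = x^2, where the series terminates and the identity is exact.
a, b = 1.0, 2.0
f   = lambda x: x**2    # f
df  = lambda x: 2 * x   # f'
d2f = lambda x: 2.0     # f''

series = f(a)*(b - a) + df(a)*(b - a)**2/2 + d2f(a)*(b - a)**3/6
exact  = b**3/3 - a**3/3   # F(b) - F(a) with F(x) = x^3/3

print(series, exact)  # both equal 7/3
```

For non-polynomial $f$, truncating the series gives an approximation that improves as $b\to a$, exactly as the remainder terms of the Taylor expansion suggest.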