Taylor series error: Interval of $\xi$


If we calculate the remainder of a Taylor series, then we will have an unknown variable $\xi$. We can still say that it must lie between $x_0$ and $x$ ($x_0$ is the expansion point).
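For reference, and assuming the script uses the Lagrange form of the remainder (the question does not say which form it uses), the statement in question is: for the degree-$n$ Taylor polynomial of $f$ around $x_0$, there exists some $\xi$ between $x_0$ and $x$ such that

$$
R_n(x) = f(x) - \sum_{k=0}^{n} \frac{f^{(k)}(x_0)}{k!}(x-x_0)^k = \frac{f^{(n+1)}(\xi)}{(n+1)!}\,(x-x_0)^{n+1}.
$$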

So my script says $\xi∈(x_0,x)$ for $x>x_0$ and $\xi∈(x,x_0)$ for $x<x_0$, which seems to be an incomplete definition to me: the case $x=x_0$ is missing. The error at $x=x_0$ should be zero, but is the error undefined at that point? Or why is there no case for $x=x_0$?

I could not find a single other source that gives an interval for $\xi$.

So can anybody explain to me why $\xi$ is undefined for $x=x_0$ or where my mistake is?


The statement "there exists $\xi \in (x_0,x)$ such that ...." is stronger than the statement "there exists $\xi \in [x_0,x)$ such that ....". In general the best we can say is that $\xi \in (x_0,x)$ (assuming that $x_0<x$). As for $x=x_0$: there the Taylor polynomial agrees with $f(x_0)$, so the remainder is exactly zero and no $\xi$ is needed at all; that is why no case for $x=x_0$ is stated.
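A quick numerical check can make this concrete. The following sketch (my own example, not from the question) takes $f(x)=e^x$, $x_0=0$, and the first-order polynomial $T_1(x)=1+x$; the Lagrange remainder is $R_1(x)=\frac{e^{\xi}}{2}x^2$, which can be solved for $\xi$ explicitly, and the result indeed lands strictly inside $(x_0, x)$:

```python
import math

# Assumed example: f(x) = exp(x), x0 = 0, first-order Taylor polynomial
# T1(x) = 1 + x.  Lagrange form of the remainder:
#   R1(x) = f''(xi)/2! * (x - x0)^2 = exp(xi)/2 * x^2
# for some xi in (x0, x).  Solving for xi: xi = ln(2 * R1(x) / x^2).

def xi_for_exp_first_order(x, x0=0.0):
    """Return the xi guaranteed by the Lagrange remainder for f = exp,
    n = 1, around x0 = 0.  Only meaningful for x != x0."""
    remainder = math.exp(x) - (1.0 + x)            # R1(x) = f(x) - T1(x)
    return math.log(2.0 * remainder / (x - x0) ** 2)

x = 1.0
xi = xi_for_exp_first_order(x)
print(xi)                    # some value strictly between 0 and 1
assert 0.0 < xi < x          # xi lies in the open interval (x0, x)
```

Note that the formula for $\xi$ divides by $(x-x_0)^2$, so it is undefined at $x=x_0$, exactly where the remainder is identically zero and no $\xi$ is needed.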