The Taylor series expansion of $f(x)=e^x$ about $a=1$ is:
$T(x,a=1)=\sum_{n=0}^{\infty}\frac{e\,(x-1)^{n}}{n!}$
By the absolute ratio test, the interval of convergence is $x\in(-\infty,\infty)$.
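As a quick numerical sanity check (a sketch; the function name and truncation point are my own choices), the partial sums of this series should match $e^x$ at any point, e.g. $x=2$:

```python
import math

def taylor_exp_about_1(x, terms=30):
    # Partial sum of sum_{n=0}^{terms-1} e * (x-1)^n / n!
    return sum(math.e * (x - 1)**n / math.factorial(n) for n in range(terms))

# Compare the truncated series to e^2 at a point away from the center
print(taylor_exp_about_1(2.0))  # should be very close to math.exp(2.0)
print(math.exp(2.0))
```

Note that Python evaluates `(x - 1)**0` as `1` even when `x == 1`, since `0**0 == 1` there.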
However, if we evaluate the series at $x=1$, we obtain:
$T(x=1,a=1)=\sum_{n=0}^{\infty}\frac{e\cdot 0^{n}}{n!}=0$
instead of $f(1)=e^1=e$.
What is wrong with this calculation?