My understanding of Taylor's theorem is:
Let $f:\mathbf{R} \to \mathbf{R}$ be $k+1$ times differentiable on the open interval between $a$ and $x$, with $f^{(k)}$ continuous on the closed interval between them. Then $f(x) = P_k(x) + R_k(x)$
where
$P_k(x) = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \frac{f'''(a)}{3!}(x-a)^3 + \cdots + \frac{f^{(k)}(a)}{k!}(x-a)^k$
and
$R_k(x) = \frac{f^{(k+1)}(c)}{(k+1)!}(x-a)^{k+1}$ where $c$ is between $a$ and $x$.
At least this is one form of the theorem.
So if we apply this to something like $f(x)=e^x$ at $a=0,$ we get
$e^x = 1 + x + \frac{x^2}{2!} + R_2(x)$
or
$e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + R_3(x)$
etc...
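As a sanity check on the truncations above, one can compare the partial sums $P_k(x)$ for $f(x)=e^x$ at $a=0$ against the library value of $e^x$. A minimal Python sketch (the function name `taylor_partial_sum` is my own):

```python
import math

def taylor_partial_sum(x, k):
    """P_k(x) for f = exp expanded at a = 0: the sum of x^n / n! for n = 0..k."""
    return sum(x**n / math.factorial(n) for n in range(k + 1))

x = 1.0
for k in (2, 3, 5, 10):
    pk = taylor_partial_sum(x, k)
    # The gap math.exp(x) - pk is exactly the remainder R_k(x), and it shrinks as k grows.
    print(k, pk, math.exp(x) - pk)
```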
But I often see the equality $e^x$ with the infinite series $1+ x + \frac{x^2}{2!} + \cdots$.
How is something like this achieved? Is it a consequence of Taylor's theorem? I suppose we need something like $\lim_{k \to \infty} R_k(x) = 0$ for all $x$?
Taylor series are a special case of power series, and you are correct: the equality with the infinite series holds exactly when $\lim_{k \to \infty} R_k(x) = 0$. In general a Taylor series converges only in a neighborhood of the expansion point, but for some functions it converges for all $x$ (e.g. $e^x$, $\sin x$, $\cos x$, ...). For $f(x)=e^x$ at $a=0$, the Lagrange remainder satisfies $|R_k(x)| \le e^{|x|}\frac{|x|^{k+1}}{(k+1)!}$, and since factorial growth beats exponential growth this bound tends to $0$ as $k \to \infty$ for every fixed $x$, so the series converges to $e^x$ everywhere.
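For $e^x$ at $a=0$, the Lagrange form gives $|R_k(x)| \le e^{|x|}\,|x|^{k+1}/(k+1)!$, which tends to $0$ for every fixed $x$. A quick numerical sketch of that bound vanishing (the function name `remainder_bound` is my own):

```python
import math

def remainder_bound(x, k):
    """Lagrange bound on |R_k(x)| for f = exp at a = 0:
    |f^(k+1)(c)| <= e^{|x|} for c between 0 and x, so
    |R_k(x)| <= e^{|x|} * |x|^(k+1) / (k+1)!."""
    return math.exp(abs(x)) * abs(x)**(k + 1) / math.factorial(k + 1)

x = 5.0
for k in (5, 10, 20, 30):
    print(k, remainder_bound(x, k))  # the bound shrinks toward 0 as k grows
```

Even for a large input like $x=5$, the factorial in the denominator eventually dominates and the bound collapses to zero.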