If we have an $(n+1)$-times differentiable function $f:\mathbb{R}\rightarrow\mathbb{R}$, I know that $$f(x)=\sum_{j=0}^n\frac{f^{(j)}(a)}{j!}(x-a)^j+\frac{f^{(n+1)}(\xi_{a,x})}{(n+1)!}(x-a)^{n+1},\quad \xi_{a,x}\in [a,x]$$ (this is Taylor's Theorem with the Lagrange form of the remainder).
If $f:\mathbb{R}\rightarrow\mathbb{C}$, I know that the previous formula may fail: take $f(t)=\cos t+i\sin t=e^{it}$. Then $f(2\pi)-f(0)=0$, while $f'(t)=-\sin t+i\cos t$ has modulus $1$ for every $t$, so no intermediate point $\xi$ can work even in the mean value theorem (the case $n=0$).
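A quick numerical sanity check of this counterexample (just an illustration, using only the standard library):

```python
import cmath

# f(t) = e^{it} = cos t + i sin t, whose derivative f'(t) = i e^{it} has modulus 1.
f = lambda t: cmath.exp(1j * t)
df = lambda t: 1j * cmath.exp(1j * t)

two_pi = 2 * cmath.pi
print(abs(f(two_pi) - f(0)))     # ≈ 0 (up to floating-point rounding)

# The mean value theorem would require f(2*pi) - f(0) = f'(xi) * 2*pi
# for some xi in [0, 2*pi], but the right-hand side has modulus 2*pi
# for every xi, so it can never equal 0:
for k in range(5):
    xi = two_pi * k / 4
    print(abs(df(xi) * two_pi))  # ≈ 2*pi for every xi
```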
However, I do not understand why in some proofs involving functions from $\mathbb{R}$ to $\mathbb{C}$, people keep using Taylor's theorem. For example, in the proof of the Central Limit Theorem of probability by means of characteristic functions:
Central Limit Theorem: If $\{X_n\}_n$ is a sequence of i.i.d. random variables, with finite mean $\mu$ and finite variance $\sigma^2$, then $$\frac{\frac{1}{n}\sum_{k=1}^nX_k-\mu}{\sigma/\sqrt{n}}\stackrel{L}{\longrightarrow} N(0,1).$$
Proof: Notice that $$Y_n=\frac{\frac{1}{n}\sum_{k=1}^nX_k-\mu}{\sigma/\sqrt{n}}=\frac{1}{\sqrt{n}}\sum_{k=1}^n Z_k,\quad Z_k=\frac{X_k-\mu}{\sigma}.$$ Then $$\varphi_{Y_{n}}(t)=\varphi_{Z_{1}}\left(\frac{t}{\sqrt{n}}\right)^n,$$where $\varphi$ is the characteristic function. Then one writes $$\varphi_{Z_{1}}(t)=1-\frac{t^2}{2}+(1-E[Z_1^2e^{iZ_1\xi_t}])\frac{t^2}{2},\quad \xi_t\in [0,t],$$and uses the fact that $\lim_n E[Z_1^2e^{iZ_1\xi_{t/\sqrt{n}}}]=1$ by dominated convergence to conclude.
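As a numerical sanity check of the conclusion (not part of the proof), one can estimate $\varphi_{Y_n}(t)$ by Monte Carlo and compare it with the standard normal characteristic function $e^{-t^2/2}$; here I use standardized $\mathrm{Exp}(1)$ variables (mean $1$, variance $1$) as the $X_k$:

```python
import math, random, cmath

random.seed(0)

def phi_Yn(t, n, samples=20000):
    """Monte Carlo estimate of E[exp(i t Y_n)] for Y_n = (1/sqrt(n)) * sum Z_k,
    where Z_k = X_k - 1 with X_k ~ Exp(1) (so E[Z_k] = 0, Var[Z_k] = 1)."""
    total = 0j
    for _ in range(samples):
        y = sum(random.expovariate(1.0) - 1.0 for _ in range(n)) / math.sqrt(n)
        total += cmath.exp(1j * t * y)
    return total / samples

# The gap |phi_{Y_n}(t) - e^{-t^2/2}| should shrink as n grows.
t = 1.0
for n in (1, 10, 100):
    print(n, abs(phi_Yn(t, n) - cmath.exp(-t**2 / 2)))
```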
What makes the Taylor approximation work in the context of complex-valued functions is that we only need the Peano form of the remainder: $$\lim_{x\to a} \frac{1}{(x-a)^{n}}\left| f(x)-\sum_{j=0}^n\frac{f^{(j)}(a)}{j!}(x-a)^j\right|=0.$$ This holds for $f:\mathbb{R}\to\mathbb{C}$ because it holds separately for the real and imaginary parts of $f$. In particular, we do not need an exact expression for the remainder. By the way, you can also express the remainder with an integral instead of a specific value of $f^{(n+1)}$, and that form remains valid for complex-valued functions.
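For completeness, the integral form of the remainder mentioned above reads (for $f$ with continuous $(n+1)$-th derivative):
$$f(x)=\sum_{j=0}^n\frac{f^{(j)}(a)}{j!}(x-a)^j+\frac{1}{n!}\int_a^x (x-s)^n f^{(n+1)}(s)\,ds.$$
Since the integral of a complex-valued function is defined by integrating the real and imaginary parts separately, this identity holds verbatim for $f:\mathbb{R}\to\mathbb{C}$, unlike the Lagrange form with its single intermediate point $\xi_{a,x}$.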