By the error function for the sum $$\sum_{i = 0}^\infty \frac{1}{i!},$$ I mean the function $$f : \mathbb{R}_{> 0} \rightarrow \mathbb{N}$$ defined as follows.
For each $\varepsilon \in \mathbb{R}_{>0}$, $f(\varepsilon)$ is the least natural number $n$ such that $$\left|e - \sum_{i = 0}^n \frac{1}{i!}\right| < \varepsilon$$
More generally, any convergent series has a corresponding error function.
Question. Regarding the above series for Euler's number $e$, is it known how to compute $f(\varepsilon)$ when $\varepsilon$ is an explicitly known rational number written as a quotient of two coprime integers?
From Taylor's theorem, you know that
$$ e - \sum_{i=0}^n \frac{1}{i!} = \dfrac{e^{\xi_n}}{(n+1)!}, \quad \xi_n \in [0,1], $$
so your condition would be equivalent to
$$ \dfrac{e^{\xi_n}}{(n+1)!} < \varepsilon $$
Now, in the best-case scenario, setting $\xi_n = 0$, we would get $$ (n+1)! > \frac{1}{\varepsilon} $$
and in the worst-case scenario, $\xi_n = 1$, we would get $$ (n+1)! > \frac{e}{\varepsilon} $$
Solving these inequalities provides lower and upper bounds for $f(\varepsilon)$, but in general they will not pin down the least $n$ that you are looking for.
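Both the factorial bounds and the exact value of $f(\varepsilon)$ can be computed for a rational $\varepsilon$. A minimal Python sketch (the helper names `least_n_factorial`, `tail_lt`, and `f` are mine, not standard): the key point is that the tail $R_n = \sum_{i > n} 1/i!$ can be compared against a rational $\varepsilon$ in finitely many steps, because after summing the tail exactly up to index $m$, the rest is bounded above by $2/(m+1)!$ (a geometric bound, since each further term shrinks by a factor $\le 1/(m+2)$), and $R_n$ is irrational so the comparison always resolves.

```python
from fractions import Fraction
import math


def least_n_factorial(bound):
    """Least n with (n+1)! > bound -- solves the inequalities above."""
    n, fact = 0, 1  # fact == (n+1)!
    while fact <= bound:
        n += 1
        fact *= n + 1
    return n


def tail_lt(n, eps):
    """Decide exactly whether R_n = sum_{i>n} 1/i! < eps, for rational eps.

    Accumulates tail terms as exact rationals; after adding terms through
    index m, the remaining tail is strictly less than 2/(m+1)!, so one of
    the two tests below eventually fires.
    """
    partial = Fraction(0)
    term = Fraction(1, math.factorial(n))  # becomes 1/m! as m advances
    m = n
    while True:
        m += 1
        term /= m                 # term = 1/m!
        partial += term           # partial = sum_{i=n+1}^{m} 1/i!
        bound = 2 * term / (m + 1)  # remaining tail < 2/(m+1)!
        if partial + bound < eps:
            return True           # R_n is certainly below eps
        if partial >= eps:
            return False          # R_n is certainly at least eps


def f(eps):
    """Least n with |e - sum_{i=0}^n 1/i!| < eps."""
    n = 0
    while not tail_lt(n, eps):
        n += 1
    return n
```

For example, with $\varepsilon = 1/10$ the two inequalities give $3 \le f(\varepsilon) \le 4$ (using the crude bound $e < 3$ for the worst case), and the exact computation shows $f(1/10) = 3$.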