If you want to prove that the limit of $f(x)$ as $x$ goes to $a$ is equal to $L$ using the epsilon-delta definition of the limit, you need to solve the inequality
$$|f(x)-L|<\epsilon$$
for $x$, getting it into the form
$$|x-a|<\delta$$
for some $\delta$, which will in general be a function of $\epsilon$.
My question is: is there some way to calculate the function $\delta(\epsilon)$, short of solving the inequality above using the particular function $f$ you have?
Is it at least possible if $f$ is sufficiently well behaved? Like if $f$ is differentiable, can you calculate $\delta(\epsilon)$ using the derivative of $f$?
EDIT: This journal paper shows a formula for polynomials. If $f(x) = \sum_{n=0}^{k} a_n (x-a)^n$, then to prove that the limit of $f(x)$ as $x$ goes to $a$ equals $f(a)$, we can let $\delta = \min\left(1,\frac{\epsilon}{\sum_{n=1}^{k} |a_n|}\right)$.
Can this be generalized to Taylor series? If $f(x) = \sum_{n=0}^{\infty} a_n (x-a)^n$, then can we prove that the limit of $f(x)$ as $x$ goes to $a$ equals $f(a)$ by letting $\delta = \min\left(1,\frac{\epsilon}{\sum_{n=1}^{\infty} |a_n|}\right)$?
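As a quick numerical sanity check of the polynomial formula (my own sketch, not from the paper), here is a hypothetical example polynomial $f(x) = 2 + 3(x-1) - 5(x-1)^2$, so $a=1$, $f(a)=2$, and $\sum_{n\ge 1}|a_n| = 3+5 = 8$:

```python
# Check delta = min(1, eps / sum(|a_n|, n >= 1)) on a sample polynomial.
# Example (hypothetical choice): f(x) = 2 + 3(x-1) - 5(x-1)^2 about a = 1.

def f(x):
    return 2 + 3 * (x - 1) - 5 * (x - 1) ** 2

a, L = 1.0, 2.0
coeff_sum = 3 + 5  # sum of |a_n| for n >= 1

for eps in (1.0, 0.1, 0.001):
    delta = min(1.0, eps / coeff_sum)
    # Sample points with 0 < |x - a| < delta and confirm |f(x) - L| < eps.
    xs = [a + delta * t for t in (-0.99, -0.5, 0.5, 0.99)]
    assert all(abs(f(x) - L) < eps for x in xs)
```

The assertions pass because, for $|h| < 1$, $|3h - 5h^2| \le 3|h| + 5|h|^2 \le 8|h|$, exactly the bound the formula exploits.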
Below I deal with the power series question. I'll use your notation and assume WLOG that $a=0.$
Here's a simple solution to the general $\delta = \varphi(\epsilon)$ question that uses a different idea. Suppose the radius of convergence of the series is $r\in (0,\infty).$ Then
$$f'(x) = \sum_{n=1}^{\infty}na_nx^{n-1},\,\,|x|<r.$$
Define $D=\sum_{n=1}^{\infty}n|a_n|(r/2)^{n-1}.$ Then for $|x|<r/2,$ the mean value theorem gives $f(x)-f(0) = f'(c_x)\,x$ for some $c_x$ between $0$ and $x$; since $|c_x|<r/2$ implies $|f'(c_x)|\le D,$ we get
$$|f(x)-f(0)| = |f'(c_x)||x| \le D|x|.$$
Thus $\delta = \min(r/2,\epsilon/D)$ is a solution.
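For a concrete sketch (my own check, with a specific series chosen for illustration), take the geometric series $f(x) = \sum_{n\ge 0} x^n = 1/(1-x)$, which has $a = 0$ and radius $r = 1$. Here $D = \sum_{n\ge 1} n(1/2)^{n-1} = 4$, so the recipe gives $\delta = \min(1/2, \epsilon/4)$:

```python
# Check delta = min(r/2, eps/D) on f(x) = sum_{n>=0} x^n = 1/(1-x), r = 1.
# D = sum_{n>=1} n * (r/2)^(n-1) = sum_{n>=1} n * (1/2)^(n-1) = 4.

def f(x):
    return 1.0 / (1.0 - x)  # closed form of the geometric series for |x| < 1

r, D = 1.0, 4.0

for eps in (2.0, 0.5, 1e-3):
    delta = min(r / 2, eps / D)
    # Sample points with 0 < |x| < delta and confirm |f(x) - f(0)| < eps.
    xs = [delta * t for t in (-0.99, -0.5, 0.5, 0.99)]
    assert all(abs(f(x) - f(0.0)) < eps for x in xs)
```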
Note that since $r = 1/\limsup |a_n|^{1/n},$ we really do have a formula for $\delta $ as a function of $\epsilon$ that depends only on the coefficients $a_1,a_2, \dots.$ Note also that in the case $r=\infty,$ we can replace $r/2$ by $1$ in the above, and everything goes through.
Now to your specific question: Does $\delta = \min(1,\epsilon/(\sum_{n=1}^{\infty}|a_n|))$ work? The answer is yes, assuming $\sum|a_n| < \infty.$
Proof: Because $\sum|a_n| < \infty,$ the power series defining $f$ has radius of convergence at least $1.$ Let $\epsilon>0.$ Set $\delta = \min(1,\epsilon/(\sum_{n=1}^{\infty}|a_n|)).$ If $|x|<\delta,$ then
$$|f(x)-f(0)| = |\sum_{n=1}^{\infty}a_nx^n|\le \sum_{n=1}^{\infty}|a_n||x|^n$$ $$ = |x| \sum_{n=1}^{\infty}|a_n||x|^{n-1} \le |x| \sum_{n=1}^{\infty}|a_n| <\epsilon.$$
This result covers all cases where the radius of convergence is greater than $1.$ But obviously the result fails if $\sum|a_n| = \infty.$ Here we are in the case where the radius of convergence $r$ is a number in $(0,1].$ This can be handled by scaling into the $\sum|a_n| < \infty$ situation, and then scaling back. But the answer isn't as simple in this case. Since Micah's answer already covers this argument, I'll omit it here. (Note that the first method I mentioned, involving $f'(x),$ does not require this scaling argument.)
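To close with a numerical illustration of the $\sum|a_n| < \infty$ case (again my own sketch), take $f(x) = e^x$ at $a = 0$, where $a_n = 1/n!$ and $\sum_{n\ge 1} |a_n| = e - 1$:

```python
import math

# Check delta = min(1, eps / sum |a_n|) for f(x) = e^x at a = 0,
# where a_n = 1/n! and sum_{n>=1} 1/n! = e - 1 (absolutely summable).

S = math.e - 1.0  # sum of |a_n| for n >= 1

for eps in (1.0, 0.1, 1e-4):
    delta = min(1.0, eps / S)
    # Sample points with 0 < |x| < delta and confirm |e^x - 1| < eps.
    xs = [delta * t for t in (-0.99, -0.5, 0.5, 0.99)]
    assert all(abs(math.exp(x) - 1.0) < eps for x in xs)
```

Note that for small positive $x$ the check is not far from tight, since $e^x - 1 \approx x \cdot \sum_{n\ge 1} x^{n-1}/n!$ approaches $x$ as $x \to 0^+$, while the bound used is $x(e-1)$.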