When it comes to epsilon-delta proofs, almost all texts (if not all) appear to make a perfect choice of epsilon from the start, so that when everything is worked out we end up with, say, $$|f(x)-f(x_0)|<\epsilon$$ which is of course very neat.
However, I cannot fathom that everybody knows exactly which epsilon to choose, especially in a problem they are initially a little uncertain about. If you can see the proof from beginning to end at a glance, then sure, I can understand how you could make the clever choice from the start.
But realistically, for the more experienced here: do you always know the right epsilon to pick from the start? Or do you muck around first and then adjust the values so that everything works out perfectly, as in my example above?
Specifically, I was working on the following problem:
Consider the metric space $C[a,b]$ with distance being defined as the supremum norm over the interval. Let $(f_n)$ be a sequence in our metric space that converges uniformly to a function $f$ on $[a,b]$. Prove that $f$ is continuous on $[a,b]$.
The solution, which motivated my question here, is below.

I think there's a bit of confusion, as most of the answers/comments illustrate the fact that in a typical $\varepsilon$-$\delta$ argument, you let $\varepsilon > 0$ be arbitrary and then show that you can always find a suitable delta that keeps some expression below this $\varepsilon$. That is correct, but it's not exactly what is happening in your proof: the situation is more complicated in the sense that there are 'different epsilons' involved.
Your proof is a typical example of where you use certain properties (such as continuity of $f_n$ and the uniform convergence $f_n \to f$) to prove another property (the continuity of the limit function $f$). Now, these earlier, known properties also have $\varepsilon$-$\delta$-definitions and since you assume these properties hold, you know that for any $\varepsilon > 0$, there is a suitable delta that keeps some expression (related to the assumed property) below this $\varepsilon$.
Since these properties hold for any $\varepsilon > 0$, they also hold for $\varepsilon/2$, $\varepsilon/3$, etc. The reason this comes in handy is that the definition of the property you're trying to show also ends in showing that some (other) expression remains below (any) $\varepsilon > 0$. If you use the same $\varepsilon$ for the other, assumed properties and then apply the triangle inequality at the end - like in your proof - you end up with something like: $$\color{red}{|f(x)-f(x_0)| \le} |f(x)-f_N(x)|+|f_N(x)-f_N(x_0)|+|f_N(x_0)-f(x_0)| < \varepsilon+\varepsilon+\varepsilon=\color{red}{3\varepsilon}$$ Now if you started the proof with "choose an arbitrary $\varepsilon > 0$", you end up with the 'ugly' $3\varepsilon$ where you want only $\varepsilon$.
You could fix this by choosing an arbitrary $\color{blue}{\varepsilon'} > 0$ at the start and applying the assumed properties with the $\varepsilon$ satisfying $\color{blue}{\varepsilon'} = 3\varepsilon$; the final expression then nicely fits the formal definition. Some consider this less elegant and start with $\varepsilon$, but then choose the epsilons applied to the other, assumed properties as $\varepsilon/3$. You only know the "/3" will do the trick because you use the triangle inequality to split things up into three pieces.
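For the record, here is how the continuity proof reads with the $\varepsilon/3$ choice made up front (a sketch; $N$ comes from the assumed uniform convergence and $\delta$ from the continuity of $f_N$):

```latex
% Sketch: continuity of the uniform limit f, with \varepsilon/3 chosen up front.
\begin{proof}
Let $\varepsilon > 0$ and $x_0 \in [a,b]$ be arbitrary.
By uniform convergence, choose $N$ such that
$\sup_{x \in [a,b]} |f_N(x) - f(x)| < \varepsilon/3$.
Since $f_N$ is continuous at $x_0$, choose $\delta > 0$ such that
$|x - x_0| < \delta$ implies $|f_N(x) - f_N(x_0)| < \varepsilon/3$.
Then for all $x \in [a,b]$ with $|x - x_0| < \delta$,
\[
  |f(x) - f(x_0)|
    \le |f(x) - f_N(x)| + |f_N(x) - f_N(x_0)| + |f_N(x_0) - f(x_0)|
    < \tfrac{\varepsilon}{3} + \tfrac{\varepsilon}{3} + \tfrac{\varepsilon}{3}
    = \varepsilon.
\]
Hence $f$ is continuous at $x_0$, and since $x_0$ was arbitrary, on $[a,b]$.
\end{proof}
```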
Example
Elaborating on @quid's comment, it might be insightful to see the same mechanism applied to a simpler example.
Let $a_n \to a$ and $b_n \to b$, which means that for all $\varepsilon >0$, there exist numbers $N_1$ and $N_2$ such that: $$n > N_1 \implies |a_n-a| < \varepsilon \quad\mbox{and}\quad n > N_2 \implies |b_n-b| < \varepsilon \quad \quad \color{blue}{(*)}$$ Now choose an arbitrary $\varepsilon' >0$ and let $n > \max\left\{ N_1,N_2\right\}$, then: $$|(a_n+b_n)-(a+b)| \le |a_n-a|+|b_n-b| < \varepsilon+\varepsilon = 2\varepsilon = \varepsilon'$$ This works if you choose $\varepsilon = \varepsilon'/2$ in the definitions $\color{blue}{(*)}$.
Alternatively, you could just choose $\varepsilon >0$ and take $\varepsilon/2$ when you use the definitions of convergence of $a_n$ and $b_n$; you then neatly end up with $\ldots < \tfrac{\varepsilon}{2}+\tfrac{\varepsilon}{2} = \varepsilon$.
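Written out in full with the $\varepsilon/2$ choice made at the start, that second version of the sum-of-limits proof is simply:

```latex
% Sketch: lim (a_n + b_n) = a + b, with \varepsilon/2 chosen up front.
\begin{proof}
Let $\varepsilon > 0$ be arbitrary.
Since $a_n \to a$, choose $N_1$ such that $n > N_1$ implies $|a_n - a| < \varepsilon/2$;
since $b_n \to b$, choose $N_2$ such that $n > N_2$ implies $|b_n - b| < \varepsilon/2$.
Then for all $n > \max\{N_1, N_2\}$,
\[
  |(a_n + b_n) - (a + b)|
    \le |a_n - a| + |b_n - b|
    < \tfrac{\varepsilon}{2} + \tfrac{\varepsilon}{2}
    = \varepsilon.
\]
\end{proof}
```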