This is more of a general question.
We recently started working with $\varepsilon(-\delta)$-proofs, but quickly moved on to propositions that make it easier to show that a series, function, etc. converges. The thing about these propositions is that they can often only be applied when certain criteria are met, whereas $\varepsilon(-\delta)$-proofs can be seen as the all-purpose method. And I think it would be pretty powerful to always be able to prove something with the $\varepsilon(-\delta)$ approach, in an exam for example.
But then again - it's (at least for me) pretty hard to find the fitting $\varepsilon$ or $\delta$ for some problems.
To those of you who are more experienced: Is there a method or heuristic that makes it easier to construct your own $\varepsilon(-\delta)$-proofs, or is it really just "seeing by pondering"?
Hint:
You have to find a function $\delta$ such that for a given $\epsilon$
$$|x-x_0|<\delta(\epsilon)\implies|f(x)-L|<\epsilon.$$
Assuming that $f$ is strictly increasing (hence invertible), the right inequality is equivalent to $L-\epsilon<f(x)<L+\epsilon$, which can be read as $$f^{-1}(L-\epsilon)<x<f^{-1}(L+\epsilon)$$
and a suitable $\delta$ is given by
$$\delta(\epsilon)=\min(|f^{-1}(L-\epsilon)-x_0|,|f^{-1}(L+\epsilon)-x_0|).$$
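To see the formula in action, here is a small numerical check with a concrete example of my own choosing (not from the answer): $f(x)=x^2$ near $x_0=2$, where $f$ is strictly increasing and $L=4$.

```python
import math
import random

# Example (my choice, not from the answer): f(x) = x^2 near x0 = 2,
# so L = f(x0) = 4; f is strictly increasing for x > 0.
f = lambda x: x * x
f_inv = math.sqrt  # inverse of f on x > 0
x0, L = 2.0, 4.0

def delta(eps):
    # delta(eps) = min(|f^-1(L - eps) - x0|, |f^-1(L + eps) - x0|)
    return min(abs(f_inv(L - eps) - x0), abs(f_inv(L + eps) - x0))

# Sanity check: every x with |x - x0| < delta(eps) satisfies |f(x) - L| < eps.
rng = random.Random(0)
for eps in (1.0, 0.1, 0.001):
    d = delta(eps)
    for _ in range(10_000):
        x = x0 + 0.999 * rng.uniform(-d, d)
        assert abs(f(x) - L) < eps
```

Note that taking the minimum of the two distances is what makes this safe: the interval $(f^{-1}(L-\epsilon),\,f^{-1}(L+\epsilon))$ is generally not symmetric around $x_0$, and $\delta$ must fit inside the narrower side.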
This also works when $f$ is decreasing.
If $f$ has extrema near $x_0$, making it non-invertible there, you can additionally constrain $\delta$ to be smaller than the distance from $x_0$ to the nearest extremum, so that $f$ is monotonic on the interval you work with.
In practice, you will simplify or approximate the expression of the inverse, making sure that the resulting interval is tighter; a smaller $\delta$ always still works.