I'm looking at provable global error bounds for the Euler method for the first time, and I was surprised to find that the bound grows exponentially in the length of the time interval (the domain size) over which the solution is propagated, i.e.
$\delta_{err}\le A \;dt\; (e^{L (t_1-t_0)}-1)$
with $A$ a function of problem-dependent bounds (e.g. the Lipschitz constant $L$). At first I thought this was extremely bad because of the exponential dependence on the domain size, but then it occurred to me that I could rescale the time units so that $t_1-t_0=1$. This seems like it might be problematic, though.
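To spell out why the rescaling seems suspicious to me (a sketch, assuming "rescaling the time units" means the substitution below):

```latex
% Rescale time: \tau = (t - t_0)/(t_1 - t_0), so \tau runs over [0, 1].
% Writing \tilde{y}(\tau) = y\bigl(t_0 + \tau (t_1 - t_0)\bigr), the chain rule gives
\frac{d\tilde{y}}{d\tau}
  = (t_1 - t_0)\, f\bigl(t_0 + \tau (t_1 - t_0),\, \tilde{y}(\tau)\bigr).
% If f is L-Lipschitz in y, the rescaled right-hand side is Lipschitz with
% constant \tilde{L} = L (t_1 - t_0), so the exponent in the bound becomes
% \tilde{L} \cdot (1 - 0) = L (t_1 - t_0), exactly what it was before.
```

So, as far as I can tell, the rescaling only moves the exponential into the Lipschitz constant rather than removing it.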
Going beyond the Euler method to Runge-Kutta and other higher-order methods doesn't help either: they improve the power of $dt$, but the error bound still grows exponentially with the domain size. Since ODEs are a well-established field, and most references don't explicitly mention this exponential growth with the domain size, I assume it must be unimportant, but I cannot see why (other than via rescaling). Hence my question:
Is there a way to be okay with this exponential dependence, either by rescaling such that $t_1-t_0=1$ or by any other means?
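For concreteness, here is a minimal numerical sketch of the effect (the `euler` helper and the test problem $y'=y$ are my own illustration, not taken from any reference):

```python
import math

def euler(f, y0, t0, t1, n):
    """Forward Euler with n uniform steps on [t0, t1]; returns the final value."""
    dt = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += dt * f(t, y)
        t += dt
    return y

# Test problem: y' = y, y(0) = 1, exact solution e^t (Lipschitz constant L = 1).
f = lambda t, y: y

# Keep dt fixed and grow the interval length T: the global error should grow
# roughly like e^T, matching the e^{L(t_1 - t_0)} factor in the bound.
dt = 1e-3
for T in (1, 2, 4, 8):
    err = abs(euler(f, 1.0, 0.0, float(T), round(T / dt)) - math.exp(T))
    print(f"T = {T}: global error = {err:.3e}, error / e^T = {err / math.exp(T):.3e}")
```

On this problem the ratio error$/e^T$ stays close to $T\,dt/2$, so at fixed $dt$ the absolute error really does blow up exponentially in $T$, consistent with the bound rather than being an artifact of it.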