Let $f,f_n\colon\mathbb{R}\times(0,1)\to\mathbb{R}$ be locally Lipschitz continuous functions, and consider the ordinary differential equations $\dot{x} = f(t,x)$ and $\dot{x}_n = f_n(t,x_n)$ with initial values $(t^0,x^0)$ and $(t_n^0,x_n^0)$, respectively. Moreover, assume that $f_n$ converges to $f$ (in some sense) and that $(t_n^0,x_n^0)\to(t^0,x^0)$. From the standard theory of ordinary differential equations we know that each ODE has a maximal time interval, $(t^\text{min},t^\text{max})$ and $(t_n^\text{min},t_n^\text{max})$ respectively, on which its solution exists and is unique.
My question is: when do we have $t_n^\text{max}\to t^\text{max}$?
Concretely, I need the result for $f,f_n\colon\mathbb{R}_+\times(0,1)\to\mathbb{R}_+$ with $f_n\to f$ uniformly on all compact subsets of $\mathbb{R}_+\times(0,1)$. Intuitively, I think this should work, since the solutions stay close to each other for all time. Do you perhaps have some literature on such questions?
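To build intuition, here is a quick numerical sketch (illustration only, not part of the question): it uses the hypothetical right-hand sides $f_n(x) = (1-1/n)\,x^2$ on $\mathbb{R}_+$ with $x(0)=1$, whose exact solutions $x(t) = 1/(1 - c_n t)$, $c_n = 1-1/n$, blow up at $t_n^\text{max} = 1/c_n \to 1 = t^\text{max}$. The script estimates the blow-up times by integrating until the solution exceeds a large threshold.

```python
# Illustration (assumed example, not from the question): estimate blow-up
# times of x' = c_n * x^2, x(0) = 1, where c_n = 1 - 1/n -> 1.
# The exact blow-up time is 1/c_n, so the estimates should approach 1.
from scipy.integrate import solve_ivp

def blowup_time(c, threshold=1e4):
    """Estimate the blow-up time of x' = c*x^2, x(0) = 1,
    as the first time the solution crosses `threshold`."""
    def event(t, x):
        return x[0] - threshold
    event.terminal = True   # stop integrating when the threshold is hit
    event.direction = 1     # only count upward crossings
    sol = solve_ivp(lambda t, x: c * x**2, (0.0, 5.0), [1.0],
                    events=event, rtol=1e-10, atol=1e-12)
    return sol.t_events[0][0]

for n in (2, 10, 100):
    c_n = 1.0 - 1.0 / n
    print(n, blowup_time(c_n), 1.0 / c_n)   # estimate vs. exact 1/c_n
```

The estimated blow-up times decrease monotonically toward $1$, matching the intuition that nearby right-hand sides produce nearby maximal existence times in this (well-behaved) family.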
Theorem 3.2 in P. Hartman's *Ordinary Differential Equations* is what you are looking for. The result says $$ \limsup_{n\to\infty} t^{\text{min}}_n\le t^{\text{min}}<t^{\text{max}}\le\liminf_{n\to\infty} t^{\text{max}}_n. $$
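Note that only this semicontinuity holds in general; $t_n^\text{max}\to t^\text{max}$ can fail. A standard counterexample (posed on all of $\mathbb{R}$ rather than $(0,1)$, for simplicity) is
$$ f(t,x) = x^2,\qquad f_n(t,x) = \min(x^2,\,n),\qquad x(0) = x_n(0) = 1. $$
Here $f_n\to f$ uniformly on compact sets (on $[-R,R]$ we have $f_n = f$ once $n\ge R^2$), and the limit solution $x(t) = 1/(1-t)$ gives $t^\text{max} = 1$, while each $f_n$ is bounded, so every $t_n^\text{max} = \infty$ and the last inequality is strict.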