Let $F : \mathbb{R}\times [0,T] \to \mathbb{R}$ be a smooth function. Suppose that the initial value problem
$$ \frac{dx}{dt} = F(x(t),t) $$ $$x(0) = \alpha \in \mathbb{R}$$
has a solution in $[0,T]$.
Why is it true that there exists $\epsilon_0 > 0$ such that the problem
$$ \frac{dx_{\epsilon}}{dt} = F(x_{\epsilon}(t),t) $$ $$x_{\epsilon}(0) = \alpha + \epsilon \in \mathbb{R}$$ has a solution in $[0,T]$ for every $\epsilon \in (0,\epsilon_0)$?
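To convince myself that the statement is not automatic, I looked at the toy example $F(x,t) = x^2$ (my own example, not part of the original problem), where the solution with $x(0) = a > 0$ is $x(t) = a/(1 - at)$ and blows up at $t^* = 1/a$. Since the blow-up time varies continuously with the initial value, if $1/a > T$ then $1/(a+\epsilon) > T$ for all small enough $\epsilon > 0$, which is exactly the kind of persistence the question asks about. A quick sanity-check script:

```python
# Toy example: for F(x, t) = x^2, the IVP x' = x^2, x(0) = a > 0 has the
# explicit solution x(t) = a / (1 - a*t), which blows up at t* = 1/a.
# The blow-up time depends continuously on a, so if the base solution
# exists on [0, T] (i.e. 1/a > T), then for every eps in (0, eps0) with
# eps0 = 1/T - a, the perturbed solution exists on [0, T] as well.

def blow_up_time(a: float) -> float:
    """Blow-up time of x' = x^2, x(0) = a, for a > 0."""
    return 1.0 / a

a, T = 1.0, 0.9            # base solution exists on [0, 0.9] since 1/a = 1 > 0.9
eps0 = 1.0 / T - a          # threshold below which the perturbation is harmless

for eps in [0.01, 0.05, 0.1]:
    assert 0 < eps < eps0
    # perturbed solution still exists on all of [0, T]
    assert blow_up_time(a + eps) > T
```

This only illustrates the phenomenon for one right-hand side, of course; the general question is why the same persistence holds for an arbitrary smooth $F$.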
I imagine that this is easy, but I was not able to prove it. I tried a pedestrian approach as follows:
Suppose the claim is false. Then for every $k > 0$ there exists $l_k \in (0, \frac{1}{k})$ such that the problem
$$ \frac{dx_{l_k}}{dt} = F(x_{l_k}(t),t) $$ $$x_{l_k}(0) = \alpha + l_k \in \mathbb{R}$$ has no solution in $[0,T]$. That is, there exists a family of ODEs that cannot be solved in $[0,T]$. But since $l_k \to 0$ as $k\to \infty$, the "limit equation" tends to the original one, which can be solved, and this should produce a contradiction. The problem is that I am not able to formalize this argument. How can I do so? Or how can I prove this (possibly by some other approach)?
This question arose in the proof of the weak maximum principle for PDEs.