Let $f:\mathbb R\to\mathbb R$ be a Lipschitz continuous, monotone increasing function with $f(0)=0$. If a function $\phi$ satisfies
$$\begin{cases} \phi'(t)=f(-\phi(t))-f(\phi(t)) \\ \phi(0)=1, \end{cases}$$
why must $\phi(t)>0$ hold for all $t\in\mathbb R$?
Actually this is not the whole exercise, only the part of it that I don't understand; any help is appreciated.
The function $G\colon \mathbb{R}\to \mathbb{R};\; G(u) = f(-u) - f(u)$ is Lipschitz continuous with $G(0) = 0$.
So the solutions to the differential equation
$$y' = G(y)\tag{$\ast$}$$
for a given initial condition are unique by the Picard–Lindelöf theorem. It is easily verified that $\psi(t) = 0$ is a solution of $(\ast)$ for the initial condition $y(t_0) = 0$. Hence any solution of $(\ast)$ that is nonzero at some point $t_0\in\mathbb{R}$ is nonzero on its entire domain of existence [which is $\mathbb{R}$, again by Picard–Lindelöf, since $G$ is globally Lipschitz]: if it vanished at some point, it would agree with $\psi$ there and, by uniqueness, everywhere.
The given $\phi$ is positive at $t_0 = 0$ by the initial condition and is a solution of $(\ast)$ by assumption, so it never vanishes; since $\phi$ is continuous, the intermediate value theorem then forces it to be positive on all of $\mathbb{R}$.
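To see the argument in action, here is a minimal numerical sketch. The concrete choice $f(u) = u$ is my assumption for illustration (it is Lipschitz, increasing, and has $f(0)=0$); with it, $(\ast)$ becomes $\phi' = -2\phi$, whose exact solution $\phi(t) = e^{-2t}$ indeed stays positive, just as the uniqueness argument predicts.

```python
import math

def f(u):
    # Illustrative choice (an assumption, not part of the problem):
    # f(u) = u is Lipschitz, monotone increasing, and f(0) = 0.
    return u

def G(u):
    # Right-hand side of (*): G(u) = f(-u) - f(u); here G(u) = -2u.
    return f(-u) - f(u)

def solve(t_end, h=1e-3):
    """Forward-Euler sketch of phi' = G(phi) with phi(0) = 1."""
    phi, t = 1.0, 0.0
    while t < t_end - 1e-12:
        phi += h * G(phi)
        t += h
    return phi

# The trajectory approaches 0 but never crosses it:
for t in (0.5, 1.0, 2.0):
    approx, exact = solve(t), math.exp(-2 * t)
    print(f"t={t}: approx={approx:.5f}, exact={exact:.5f}, positive={approx > 0}")
```

The decay toward $0$ without ever reaching it is exactly the qualitative behaviour the uniqueness theorem guarantees for every admissible $f$, not just this sample one.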