Let $u : \mathbb{R} \to \mathbb{R}$ be a continuously differentiable function (so that $u'$ below makes sense) such that $\lim_{t \to \pm \infty} u(t) = +\infty$.
We consider the following ODE:
$$
y''(t) + y'(t) + u'(y(t)) = 0
$$
I can prove that:

1. If $\lim_{t \to +\infty} y(t) = l \in \mathbb{R}$, then $u'(l) = 0$.
2. $\lvert y \rvert$, $\lvert y' \rvert$, $\lvert y'' \rvert$ are bounded.
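For context, both facts fit the standard damped-system viewpoint: the ODE admits an energy (Lyapunov) function. This computation is my own addition, assuming $u$ is $C^1$ as above; it is a sketch, not necessarily the route used to prove 1. and 2.:

```latex
% Energy function for y'' + y' + u'(y) = 0 (assumes u is C^1).
\[
E(t) = \tfrac{1}{2}\, y'(t)^2 + u\bigl(y(t)\bigr),
\]
\[
E'(t) = y'(t)\, y''(t) + u'\bigl(y(t)\bigr)\, y'(t)
      = y'(t)\,\bigl(-y'(t) - u'(y(t))\bigr) + u'\bigl(y(t)\bigr)\, y'(t)
      = -\, y'(t)^2 \le 0.
\]
% E decreases along solutions; since u(t) -> +infinity as t -> +-infinity,
% the sublevel sets of E are bounded, which is the source of the bound on |y|.
```

In particular, $E$ is nonincreasing and bounded below on any bounded sublevel set, so $E(t)$ converges, and $\int_0^\infty y'(t)^2\,dt < \infty$.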
I'd like to show that the hypothesis of property 1. always holds, i.e. that $y(t)$ always converges as $t \to +\infty$. Here's how I tried to proceed:
If $y$ has a finite number of zeros, then $\lim_{t \to +\infty} y(t)$ exists (and must be finite by boundedness).
So, heuristically, suppose $y$ does not converge. Then it must oscillate, hence have infinitely many local extrema, which gives infinitely many zeros of $y'$. I'd like to say: "by some relation, this also gives rise to infinitely many zeros of $y$", but I don't see how.
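To see the expected behaviour numerically, here is a quick illustration of my own (not part of the question) with a hypothetical coercive potential $u(y) = y^4/4 - y^2/2$, so $u'(y) = y^3 - y$ with critical points $0, \pm 1$. A hand-rolled RK4 integration shows the damped trajectory settling at a critical point of $u$, consistent with property 1.:

```python
# Illustration (assumed potential: u(y) = y**4/4 - y**2/2, coercive double well).
# Integrate y'' + y' + u'(y) = 0 as the first-order system (y, v)' = (v, -v - u'(y)).

def du(y):
    """u'(y) for the hypothetical potential u(y) = y**4/4 - y**2/2."""
    return y**3 - y

def rhs(y, v):
    """Right-hand side of the first-order system."""
    return v, -v - du(y)

def rk4_solve(y0, v0, dt=1e-3, t_end=60.0):
    """Classical RK4 on the 2D system; returns the final state (y, v)."""
    y, v = y0, v0
    for _ in range(int(t_end / dt)):
        k1y, k1v = rhs(y, v)
        k2y, k2v = rhs(y + 0.5 * dt * k1y, v + 0.5 * dt * k1v)
        k3y, k3v = rhs(y + 0.5 * dt * k2y, v + 0.5 * dt * k2v)
        k4y, k4v = rhs(y + dt * k3y, v + dt * k3v)
        y += dt * (k1y + 2 * k2y + 2 * k3y + k4y) / 6
        v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6
    return y, v

if __name__ == "__main__":
    yT, vT = rk4_solve(y0=2.0, v0=0.0)
    # y(t) settles near a well of u (one of +-1), with y'(t) -> 0 and u'(y(T)) small.
    print(yT, vT, du(yT))
```

The transient oscillation through the two wells is exactly the regime where $y'$ vanishes repeatedly while $y$ need not; the damping term is what eventually traps the trajectory in one well.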
My question is two-fold:
- How can I prove 1.?
- How should I think about the asymptotics of this kind of nonlinear ODE? Are there any insights I should keep in mind when looking at such problems?