Proving there is an asymptotic solution when the Jacobian at a singularity has a negative eigenvalue


Consider the ODE $x'(t)=f(x(t))$ (in an open subset of $\mathbb{R}^n$) with a singularity at $x_0\in\mathbb{R}^n$.

I want to prove that if the Jacobian matrix $A=f'(x_0)$ has an eigenvalue with negative real part, then there is a non-trivial solution such that $\lim_{t\to+\infty}x(t)=x_0$.

If $A$ is hyperbolic (all eigenvalues have non-zero real part), this follows from the Hartman–Grobman Theorem.
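For reference, here is a sketch of the standard argument in the hyperbolic case (my phrasing, via the stable manifold theorem rather than Hartman–Grobman directly). Writing $E^s$ for the sum of the generalized eigenspaces of $A$ associated with eigenvalues of negative real part, we have $\dim E^s \ge 1$, and the local stable manifold

```latex
\[
W^s_{\mathrm{loc}}(x_0)
  = \bigl\{\, x \in U : \varphi_t(x) \in U \ \text{for all } t \ge 0
     \ \text{and} \ \lim_{t\to+\infty}\varphi_t(x) = x_0 \,\bigr\},
\qquad
T_{x_0} W^s_{\mathrm{loc}}(x_0) = E^s,
\]
```

(where $\varphi_t$ denotes the flow and $U$ a small neighborhood of $x_0$) is a manifold of dimension $\dim E^s \ge 1$, so any point of $W^s_{\mathrm{loc}}(x_0)\setminus\{x_0\}$ gives the desired non-trivial solution.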

What if it isn't hyperbolic?
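For intuition, here is a concrete non-hyperbolic instance (my own illustrative example, not part of the question) where the conclusion still holds: $f(x,y)=(-x,\,y^2)$ has a singularity at the origin, its Jacobian there is $\operatorname{diag}(-1,0)$ (one negative eigenvalue, one zero eigenvalue, so not hyperbolic), and the non-trivial solution $(x(t),y(t))=(e^{-t}x_0,\,0)$ converges to the origin. A quick numerical check:

```python
# Non-hyperbolic example: f(x, y) = (-x, y^2), singularity at the origin.
# Jacobian at the origin is diag(-1, 0): eigenvalue -1 < 0, but not hyperbolic.
# Starting on the line y = 0 (the stable direction), the solution
# (x(t), y(t)) = (e^{-t} x0, 0) tends to the origin as t -> +infinity.

def f(x, y):
    return (-x, y * y)

def euler(x, y, dt=1e-3, steps=20_000):
    """Forward Euler integration of x' = f(x) up to time t = dt * steps."""
    for _ in range(steps):
        dx, dy = f(x, y)
        x += dt * dx
        y += dt * dy
    return x, y

# Integrate from (1, 0) up to t = 20; x decays like e^{-t}, y stays 0.
x_final, y_final = euler(1.0, 0.0)
print(x_final, y_final)
```

Of course this only shows the statement can hold without hyperbolicity; it is not a proof of the general case.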

I know that the Jacobian of $-f$ at $x_0$, namely $-A$, will have an eigenvalue with positive real part, and from this we can conclude that $x_0$ is an unstable singularity of the ODE $x'(t)=-f(x(t))$ (whose solutions are those of the original ODE with time reversed).

I was trying to show that this (instability with respect to $-f$) is sufficient for the existence of such a solution, but I couldn't prove it (and don't even know whether it is true). (Edit: it is not true.)

How could I proceed?