What is the 'stability' of a critical point of a (1 dimensional) dynamical system?


What is meant by the term 'stability' when describing a critical point of a (1-dimensional) dynamical system?

Suppose $x_{0}$ is a critical point of such a system. Does the stability of $x_{0}$ simply refer to the sign of the right-hand side of the differential equation just above and below this point?


BEST ANSWER

Stability means that for any initial condition sufficiently close to your equilibrium point (a better term than "critical point"), the corresponding solution stays close to the equilibrium point for all positive times.

For example, this happens at the equilibrium point $0$ of the equation $x'=0$: every solution is constant, so solutions starting near $0$ stay near $0$. In particular, stability does not require the right-hand side to have nonzero sign above and below the equilibrium point.
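For reference, the "stays close for all positive times" condition above is usually made precise as Lyapunov stability (a standard definition, not spelled out in the answer): the equilibrium $x_{0}$ of $x' = f(x)$ is stable if

$$\forall \varepsilon > 0 \;\; \exists \delta > 0 : \quad |x(0) - x_{0}| < \delta \;\implies\; |x(t) - x_{0}| < \varepsilon \quad \text{for all } t \ge 0.$$

For $x' = 0$ every solution is constant, so one may take $\delta = \varepsilon$, which verifies the stability of $0$ directly.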


Stability refers to what happens if you start "near" your critical point. That is, stability describes the behavior of solutions starting in a small neighborhood of $x_{0}$ as $t \to \infty$.
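The behavior described above can be seen numerically. Below is a minimal sketch (my own illustration, not from the answers) that integrates $x' = f(x)$ with forward Euler from a point near the equilibrium $0$ and compares three cases: $f(x) = -x$ (stable, solutions return toward $0$), $f(x) = x$ (unstable, solutions escape), and $f(x) = 0$ (stable without any sign change, as in the accepted answer):

```python
# Hypothetical helper, assuming a simple forward-Euler scheme.
def simulate(f, x0, t_max=10.0, dt=1e-3):
    """Integrate x' = f(x) from initial condition x0 using forward Euler,
    returning the state at time t_max."""
    x = x0
    for _ in range(int(t_max / dt)):
        x += dt * f(x)
    return x

# Stable equilibrium at 0: x' = -x pulls nearby solutions back toward 0.
print(simulate(lambda x: -x, 0.1))

# Unstable equilibrium at 0: x' = x pushes nearby solutions away.
print(simulate(lambda x: x, 0.1))

# x' = 0: every point is an equilibrium; solutions stay where they start,
# so 0 is stable even though f has no sign change around it.
print(simulate(lambda x: 0.0, 0.1))
```

The three printed values show a solution collapsing toward $0$, one blowing up, and one staying exactly at its starting point, matching the stable/unstable/degenerate-but-stable cases.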