What is meant by the term 'stability' when describing a critical point of a (1-dimensional) dynamical system?
Suppose $x_{0}$ is a critical point of such a system. Does the stability of $x_{0}$ simply refer to the sign of the right-hand side of the differential equation just above and below this point?
Stability means that for any initial condition sufficiently close to the equilibrium point (a better term here than "critical point"), the corresponding solution stays close to the equilibrium point for all positive times.
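This is the standard notion of Lyapunov stability, which can be stated precisely as:

```latex
x_0 \text{ is stable} \iff
\forall \varepsilon > 0 \;\; \exists \delta > 0 :
\quad |x(0) - x_0| < \delta \;\implies\; |x(t) - x_0| < \varepsilon
\;\; \text{for all } t \ge 0.
```

Note that this only requires solutions to *stay near* $x_0$; it does not require them to converge to $x_0$ (that stronger property is asymptotic stability).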
For example, this happens at the equilibrium point $0$ of the equation $x'=0$: every solution is constant, so any solution starting near $0$ stays near $0$ forever. In particular, this shows that the derivative need not have nonzero signs above and below the equilibrium point.
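As a quick numerical illustration (a sketch only; the `euler` helper and the step sizes are my own choices, not part of the question), compare $x'=0$, where nearby solutions stay put, with $x'=x$, where $0$ is an equilibrium but nearby solutions escape:

```python
def euler(f, x0, dt=0.001, steps=5000):
    """Integrate x' = f(x) by forward Euler from x0 up to t = dt*steps."""
    x = x0
    for _ in range(steps):
        x += dt * f(x)
    return x

# x' = 0: every point is an equilibrium; a solution starting at 0.01 stays there.
print(euler(lambda x: 0.0, 0.01))   # stays exactly at 0.01 (stable)

# x' = x: 0 is an equilibrium, but solutions starting nearby grow away from it.
print(euler(lambda x: x, 0.01))     # grows well past 1 by t = 5 (unstable)
```

This matches the definition: stability is a statement about nearby solutions staying nearby, not about the sign pattern of the right-hand side.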