I read the book A Linear Systems Primer by Panos J. Antsaklis and Anthony N. Michel (Vol. 1, Boston: Birkhäuser, 2007), and I am confused by the definition of "stable". It is defined as:
\begin{equation} \dot{x}=f(x), \quad f(0)=0 \tag{4.8} \end{equation} Definition 4.6. The equilibrium $x = 0$ of (4.8) is said to be stable if for every $\epsilon > 0$ there exists a $\delta(\epsilon) > 0$ such that \begin{equation} \|\varphi(t, x_0)\| < \epsilon \quad \text{for all } t \geq 0 \end{equation} whenever $\|x_0\| < \delta(\epsilon)$.
Does this definition imply that $\lim_{t\to\infty} \varphi(t, x_0) = 0$, since $\epsilon$ can be an arbitrarily small positive number? If that were true, the definition of stability would coincide with that of asymptotic stability, which should NOT be the case. So where does my reasoning go wrong?
Generically, for a stable system your $\delta$ can be chosen to be a strictly increasing function with $\delta(0)=0$, so that $\lim_{\epsilon \to 0} \delta(\epsilon)=0$. If $x_0 \neq 0$, the condition $\| x_0 \|<\delta(\epsilon)$ can then only be satisfied for $\epsilon>\epsilon_0$, for some fixed number $\epsilon_0 > 0$ depending on $x_0$. Thus for such an $x_0$ the definition only gives you $\| \varphi(t,x_0) \|<\epsilon$ for $\epsilon > \epsilon_0$, i.e. the fixed bound $\| \varphi(t,x_0) \| \leq \epsilon_0$, which is not arbitrarily small anymore.
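A standard example (mine, not from the quoted passage) where exactly this happens is the undamped harmonic oscillator:

$$\dot{x}_1 = x_2, \qquad \dot{x}_2 = -x_1, \qquad \varphi(t, x_0) = \begin{pmatrix} \cos t & \sin t \\ -\sin t & \cos t \end{pmatrix} x_0 .$$

The flow is a rotation, so $\|\varphi(t,x_0)\| = \|x_0\|$ for all $t \geq 0$; hence $\delta(\epsilon) = \epsilon$ witnesses stability. Yet $\varphi(t,x_0) \not\to 0$ for any $x_0 \neq 0$, so the origin is stable but not asymptotically stable.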
To modify this definition to get asymptotic stability, you would need to change the requirement on $\delta(\epsilon)$ to: "if $\| x_0 \|<\delta(\epsilon)$, then $\| \varphi(t,x_0) \|<\epsilon$ for all $t \geq t_0(\epsilon)$", i.e. the $\epsilon$-bound only needs to hold eventually. Moreover, you would need this modified $\delta(\epsilon)$ not to go to zero, so that the condition continues to apply to some nonzero values of $x_0$ even as $\epsilon \to 0$.
More intuitively speaking, stability says that a system starting in some $\delta$-ball around the equilibrium will never leave an $\epsilon$-ball around the equilibrium (where perhaps $\epsilon>\delta$). Asymptotic stability says that a system starting in some $\delta$-ball around the equilibrium will moreover converge to the equilibrium.
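This distinction can be checked numerically. Below is a small sketch (my own illustration, not from the book) using two planar systems with closed-form flows, so there is no integration error: a pure rotation, which is stable but not asymptotically stable, and a decaying rotation (a spiral sink), which is asymptotically stable.

```python
import math

def phi_stable(t, x0, y0):
    """Flow of x' = y, y' = -x (harmonic oscillator): a rotation.
    The Euclidean norm of the state is conserved, so trajectories
    stay on the circle of radius ||x0|| but never approach 0."""
    return (x0 * math.cos(t) + y0 * math.sin(t),
            -x0 * math.sin(t) + y0 * math.cos(t))

def phi_asym(t, x0, y0):
    """Flow of x' = -x + y, y' = -x - y: the same rotation scaled
    by exp(-t), so every solution spirals into the origin."""
    d = math.exp(-t)
    return (d * (x0 * math.cos(t) + y0 * math.sin(t)),
            d * (-x0 * math.sin(t) + y0 * math.cos(t)))

if __name__ == "__main__":
    for t in (0.0, 5.0, 10.0):
        print(t,
              math.hypot(*phi_stable(t, 0.3, 0.4)),   # stays at 0.5
              math.hypot(*phi_asym(t, 0.3, 0.4)))     # decays to 0
```

For the first system $\delta(\epsilon)=\epsilon$ works in Definition 4.6 even though no trajectory converges; for the second, $\|\varphi(t,x_0)\| = e^{-t}\|x_0\| \to 0$.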