Can someone explain the meaning of asymptotically stability in the following definition?


I am taking this from an online course note.

Given a system $\dot x = Ax$, we say that:

The origin is asymptotically stable if $x(t) \to 0$ as $t \to \infty \thinspace \forall x(0)$

I am confused about the usage of "origin" in this definition.

  • Where does "origin" show up in the definition?
  • What does "origin" mean precisely? Does it mean $\vec 0$? Can any other point be asymptotically stable?
  • What would the definition look like if it was asymptotically stable at some other point other than the origin?
  • What would be a physical analogy where a given system is "asymptotically stable"?

BEST ANSWER

First of all, note that your definition of asymptotic stability is not the standard one (moreover, the PDF you linked does not use your version of the definition either). Just be careful here.

Where does "origin" show up in the definition?

What does "origin" mean precisely? Does it mean $\vec 0$ ? Can any other point be asymptotically stable?

Yes, "the origin" just means the zero vector $\vec 0$. In the definition, the origin appears as the limit point of the trajectory $x(t)$: $x(t) \rightarrow \color{red}{\vec 0}$. For a linear system $\dot{x} = Ax$ (with $A$ nonsingular) the origin is the only possible equilibrium state, but a general nonlinear system may have equilibria other than the origin. When you are concerned only with the local analysis of behaviour near an equilibrium, it is very convenient to place that equilibrium at the origin (you can always make the shift $x = u + x^\ast$, where $x^\ast$ is the equilibrium and $u$ is the new phase-state variable).
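As a side note, for the linear case $\dot x = Ax$ there is a well-known eigenvalue test: the origin is asymptotically stable iff every eigenvalue of $A$ has negative real part. A minimal numerical sketch (the matrices here are made-up examples, not from the course notes):

```python
import numpy as np

def origin_is_asymptotically_stable(A):
    """For x' = Ax, the origin is asymptotically stable
    iff every eigenvalue of A has negative real part."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

A_stable = np.array([[-1.0,  0.0],
                     [ 0.0, -2.0]])   # eigenvalues -1, -2: all in the left half-plane
A_unstable = np.array([[ 1.0,  0.0],
                       [ 0.0, -1.0]]) # eigenvalue +1: trajectories along that axis blow up

print(origin_is_asymptotically_stable(A_stable))    # True
print(origin_is_asymptotically_stable(A_unstable))  # False
```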

What would the definition look like if it was asymptotically stable at some other point other than the origin?

It would look almost the same. The equilibrium $x^\ast$ is (Lyapunov) stable iff $$\forall \varepsilon > 0 \; \exists \delta(\varepsilon) : \lvert x(0) - x^\ast \rvert < \delta(\varepsilon) \Rightarrow \forall t \geqslant 0 \; \lvert x(t) - x^\ast \rvert < \varepsilon. $$

An equilibrium $x^\ast$ is asymptotically (Lyapunov) stable iff it is (Lyapunov) stable and every trajectory $x(t)$ starting in some neighbourhood of $x^\ast$ tends to $x^\ast$ (i.e., $x(t) \rightarrow x^\ast$ as $t \rightarrow +\infty$).
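To see such a non-origin equilibrium numerically, one can simulate a shifted linear system $\dot x = A(x - x^\ast)$, whose equilibrium sits at $x^\ast$ rather than at $\vec 0$. This is a sketch with made-up values for $A$, $x^\ast$, and the initial condition:

```python
import numpy as np

# Example (assumed values): x' = A (x - x_star) has its equilibrium at x_star.
A = np.array([[-1.0,  0.5],
              [-0.5, -1.0]])      # eigenvalues -1 ± 0.5i: negative real parts
x_star = np.array([2.0, -1.0])   # equilibrium away from the origin

x = np.array([5.0, 3.0])         # perturbed initial condition
dt = 0.01
for _ in range(5000):            # forward-Euler integration up to t = 50
    x = x + dt * A @ (x - x_star)

print(np.allclose(x, x_star, atol=1e-6))  # the trajectory has settled at x_star
```

Substituting $u = x - x^\ast$ turns this back into $\dot u = Au$ with the equilibrium at the origin, which is exactly the shift mentioned above.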

The definition of Lyapunov stability applies not only to equilibria but also to other kinds of motions: if the motion is not an equilibrium, just put $x^\ast(t)$ into the definition instead of $x^\ast$.

What would be a physical analogy where a given system is "asymptotically stable"?

I'm not an expert in physical explanations, but I would explain the difference between "stable, but not asymptotically stable" (call it "just stable") and "asymptotically stable" in the following way. When you displace the initial condition slightly from the equilibrium to some nearby point, the two kinds of equilibria respond differently: for an asymptotically stable equilibrium the displaced motion returns to the equilibrium, while for a just-stable equilibrium the motion stays near the equilibrium but never returns to it.

There are two nice examples in the PDF you provided. The first is a pendulum without friction. As you can see in the phase portrait, the origin is surrounded by a family of closed trajectories, so the origin is Lyapunov stable. But since no trajectory in any neighbourhood tends to the origin, it is not asymptotically stable. The situation is different for the pendulum with friction.
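The two pendulum examples can also be checked numerically. Below is a rough sketch (the damping coefficient, time step, and initial angle are assumptions of mine, not taken from the PDF) integrating $\ddot\theta = -\sin\theta - c\,\dot\theta$ with semi-implicit Euler:

```python
import math

def simulate_pendulum(theta0, damping, t_end=60.0, dt=0.001):
    """Semi-implicit Euler for theta'' = -sin(theta) - damping * theta'.
    Returns (theta, omega) at time t_end."""
    theta, omega = theta0, 0.0
    for _ in range(int(t_end / dt)):
        omega += dt * (-math.sin(theta) - damping * omega)
        theta += dt * omega
    return theta, omega

# Without friction: the displaced pendulum keeps orbiting the origin forever.
th, om = simulate_pendulum(0.5, damping=0.0)
print(abs(th) + abs(om) > 0.1)   # still swinging: stable, not asymptotically stable

# With friction: the displaced pendulum settles back to its lowest point.
th, om = simulate_pendulum(0.5, damping=0.5)
print(abs(th) < 1e-3 and abs(om) < 1e-3)  # converged to the origin: asymptotically stable
```

Semi-implicit Euler is used here because, unlike plain forward Euler, it approximately conserves the energy of the frictionless pendulum, so the closed orbits of the phase portrait survive the discretisation.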

ANOTHER ANSWER

What would be a physical analogy where given system is "asymptotically stable".

A pendulum without friction is stable at its lowest point.

A pendulum with friction is asymptotically stable at its lowest point.