Lyapunov stability and finite time convergence


I have two questions:

Problem 1: Let $V (x)$ be a Lyapunov function candidate with $x \in \mathbf{R}$, whose time derivative satisfies

$\dot{V} (x) \le - x ( x - \alpha (t) )$

where $| \alpha (t) | \le \alpha^{*}$, for all $t \ge t_{0}$, $\alpha^{*} > 0$. Then, for $|x| > \alpha^{*}$, we have $\dot{V} (x) < 0$.

Consequently, can we say that, for any $x \in \mathbf{R}$, $x$ converges to the interval $[ - \alpha^{*} , \alpha^{*}]$ in finite time?

Problem 2: Let $|x (t) | \le z (t)$ componentwise, for all $t \ge t_{0}$, where $x(t) \in \mathbf{R}^{n}$, $z(t) \in \mathbf{R}_{\ge 0}^{n}$, and $\dot{z} (t) = A z(t)$, with $A$ a Hurwitz and Metzler matrix.

Then, can we say that $x$ converges to the origin in finite time?


For problem 1:

Example 1: Non-convergence in finite time

Consider the following system:

$$\dot{x} = -x(x - \alpha(t))$$

where $x \in \mathbb{R}$ and $|\alpha(t)| \leq \alpha^*$ for all $t \geq t_0$, where $\alpha^* > 0$.

Let's assume $\alpha(t) = \alpha^*$ for simplicity. Then the system becomes:

$$ \dot{x} = -x(x - \alpha^*) $$

To analyze the convergence behavior, we can examine the Lyapunov function candidate:

$$ V(x) = \frac{1}{2}x^2 $$

Taking the derivative of $V(x)$ along the trajectories of the system:

$$\begin{aligned} \dot{V}(x) &= \frac{d}{dt}\left(\frac{1}{2}x^2\right) \\ &= x\dot{x} \\ &= -x^2(x - \alpha^*) \end{aligned}$$

Since $\alpha^*$ is a constant, we can rewrite the equation as:

$$ \dot{V}(x) = -x^2(x - \alpha^*) = -x^3 + \alpha^*x^2 $$

From this expression, we can see that $\dot{V}(x) < 0$ for $|x| > \alpha^*$, which implies that the Lyapunov function $V(x)$ is strictly decreasing for $|x| > \alpha^*$.

However, we cannot conclude from this alone that $x$ converges to the interval $[-\alpha^*, \alpha^*]$ in finite time. The inequality $\dot{V}(x) < 0$ for $|x| > \alpha^*$ only tells us that $V(x)$ decreases while $x$ is outside $[-\alpha^*, \alpha^*]$. Since $\dot{V}(x) \to 0$ as $|x| \to \alpha^*$ from above, there is no uniform negative bound on $\dot{V}$ near the boundary, so strict decrease alone does not guarantee that the interval is reached in finite time.
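This can be checked numerically. Below is a minimal sketch (forward Euler, with $\alpha(t) \equiv \alpha^* = 1$ and an illustrative initial condition, step size, and horizon): the trajectory starting above $\alpha^*$ approaches the boundary of the interval but stays strictly outside it.

```python
# Forward Euler simulation of xdot = -x*(x - alpha_star), taking alpha(t) = alpha_star.
# The values of alpha_star, x(0), dt, and the horizon are illustrative assumptions.
alpha_star = 1.0
x = 2.0                          # initial condition outside [-alpha_star, alpha_star]
dt = 1e-3
for _ in range(int(10.0 / dt)):  # integrate up to t = 10
    x += dt * (-x * (x - alpha_star))

# x has decayed toward alpha_star but is still strictly above it:
# the interval is approached asymptotically, not reached in finite time.
print(x)
```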

For problem 2:

Example 2: Stability but no finite-time convergence

Consider the following system:

$$ \dot{x} = Ax $$

where $x \in \mathbb{R}^n$, $A$ is a Hurwitz and Metzler matrix, and $x(t) \geq 0$ for all $t \geq t_0$.

Let's assume $n = 2$ for simplicity, and consider the following matrix:

$$ A = \begin{bmatrix} -1 & 1 \\ 1 & -3 \end{bmatrix} $$

The eigenvalues of $A$ are $\lambda_1 = -2 + \sqrt{2} \approx -0.586$ and $\lambda_2 = -2 - \sqrt{2} \approx -3.414$, both negative, so $A$ is Hurwitz. Its off-diagonal entries are nonnegative, so $A$ is also Metzler, as required.

Now, let's assume $x(t)$ is a vector with non-negative components:

$$ x(t) = \begin{bmatrix} x_1(t) \\ x_2(t) \end{bmatrix} $$

We can define a Lyapunov function candidate as:

$$ V(x) = \frac{1}{2}\left(x_1^2 + x_2^2\right) $$

Taking the derivative of $V(x)$ along the trajectories of the system:

$$\begin{aligned} \dot{V}(x) &= \frac{d}{dt}\left(\frac{1}{2}\left(x_1^2 + x_2^2\right)\right) \\ &= x_1\dot{x}_1 + x_2\dot{x}_2 \\ &= x_1(A_{11}x_1 + A_{12}x_2) + x_2(A_{21}x_1 + A_{22}x_2) \\ &= A_{11}x_1^2 + (A_{12} + A_{21})x_1x_2 + A_{22}x_2^2 \end{aligned}$$

For this particular $A$, the quadratic form above is negative definite, so $\dot{V}(x) < 0$ for all $x \neq 0$. (Note that $A$ being Hurwitz alone does not guarantee this for $V(x) = \frac{1}{2}\|x\|^2$; in general one must take $V(x) = x^\top P x$ with $P$ solving a Lyapunov equation.) This implies that $V(x)$ is strictly decreasing along the trajectories of the system, indicating asymptotic stability.

However, we cannot conclude finite-time convergence to the origin. Asymptotic stability ensures that $x(t)$ approaches the origin as $t \to \infty$, but a linear system can never reach the origin in finite time from $x(t_0) \neq 0$: the solution is $x(t) = e^{A(t - t_0)}x(t_0)$, and the matrix exponential is invertible, so $x(t) \neq 0$ for all finite $t$.
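A quick numerical check of this behavior (a sketch only; the matrix, initial condition, step size, and horizon are illustrative assumptions, with $A$ chosen symmetric, Hurwitz, and Metzler):

```python
import numpy as np

# A is Hurwitz (eigenvalues -2 +/- sqrt(2), both negative) and Metzler
# (nonnegative off-diagonal entries). All numbers here are illustrative.
A = np.array([[-1.0,  1.0],
              [ 1.0, -3.0]])
x = np.array([1.0, 2.0])         # nonnegative initial condition
dt = 1e-3

for _ in range(int(10.0 / dt)):  # forward Euler up to t = 10
    x = x + dt * (A @ x)

# The norm has decayed by orders of magnitude but is still strictly positive,
# and (by the Metzler property) the components have stayed nonnegative.
print(np.linalg.norm(x), x.min())
```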


Your question is related to the ultimate boundedness of a system. Your system effectively has the form $\dot x = -x + \alpha(t)$, and your Lyapunov function candidate is $V = \frac{1}{2}x^2$. Then you may write

$$ \dot V = -x^2 + x\,\alpha(t). $$

Interpret $\alpha$ as a nonvanishing perturbation. The equilibrium of the unperturbed system $\dot x = -x$ is exponentially stable, since $\dot V = -2V$ implies $V(t) = V(x(t_0))\,e^{-2(t - t_0)}$. You continue with, for any $0 < \beta < 1$,

$$ \dot V \leq -x^2 + \beta x^2 - \beta x^2 + |x|\,\alpha^* = -(1-\beta)x^2 - \beta x^2 + |x|\,\alpha^* \leq -(1-\beta)x^2 \quad \text{for } |x| > \alpha^*/\beta. $$

Therefore you are right about problem 1, up to the factor $\beta$: trajectories enter the slightly larger interval $[-\alpha^*/\beta, \alpha^*/\beta]$ in finite time. You can support this with standard results on ultimate boundedness.
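The reaching time can be made explicit. The following is a sketch derived from the $\beta$-bound above; the label $T$ is introduced here for illustration:

$$\begin{aligned} &\text{While } |x(t)| > \alpha^*/\beta: \quad \dot V \le -(1-\beta)\,x^2 = -2(1-\beta)\,V \;\Rightarrow\; V(t) \le V(t_0)\,e^{-2(1-\beta)(t - t_0)}. \\ &\text{Since } \{\,|x| \le \alpha^*/\beta\,\} = \{\,V \le \tfrac{1}{2}(\alpha^*/\beta)^2\,\}, \text{ the interval is reached no later than} \\ &\qquad T = t_0 + \frac{1}{2(1-\beta)}\,\ln\!\frac{2\,V(t_0)\,\beta^2}{(\alpha^*)^2}, \qquad \text{provided } V(t_0) > \tfrac{1}{2}(\alpha^*/\beta)^2. \end{aligned}$$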

It does not converge to zero in finite time. For something like $\dot V \leq - \sqrt{V}$ you would have finite-time convergence. However, it does converge in finite time into any interval $[-\alpha^*/\beta, \alpha^*/\beta]$ with $0 < \beta < 1$.
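The distinction between exponential and finite-time convergence is easy to see numerically. A minimal sketch (forward Euler; the step size and horizon are illustrative assumptions, and $x$ below plays the role of $V$ itself):

```python
import math

# Finite-time system: xdot = -sqrt(x) for x >= 0, i.e. exactly Vdot = -sqrt(V).
# Its exact solution x(t) = (sqrt(x0) - t/2)^2 hits zero at t = 2*sqrt(x0) = 2.
# Exponential system: ydot = -y, which never reaches zero in finite time.
dt = 1e-3
x, y = 1.0, 1.0
for _ in range(int(3.0 / dt)):           # integrate up to t = 3
    x = max(x - dt * math.sqrt(x), 0.0)  # clamp at zero once the origin is reached
    y = y - dt * y

print(x, y)  # x is exactly zero; y is small but strictly positive
```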