Lyapunov stability and gradient


I'm having trouble understanding a basic example of the use of Lyapunov stability. I'm trying to determine whether the equilibrium point of a differential equation is stable.

Consider a dynamical system defined by the first-order differential equation $\dot{x} = -x - x^3$. Its only equilibrium point is $x = 0$.

Let $V(x) = x^2$, which is strictly positive except at the equilibrium point $x = 0$. The derivative of this function is $\dot{V}(x) = \dfrac{\partial V}{\partial x} \dot{x}$ with $\dot{x} = -x - x^3$.

$\dot{V}(x) = 2x (-x - x^3) = -2x^2 - 2x^4$. Thus, $\dot{V}(x) < 0$ for $x \neq 0$.

By Lyapunov's theorem, the state of the system should converge to $0$ from any initial state.
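As a numerical sanity check of this conclusion (not part of the original question; the Euler step size and initial condition are arbitrary choices), one can integrate $\dot{x} = -x - x^3$ and verify that $V(x) = x^2$ decreases along the trajectory and that $x$ approaches $0$:

```python
# Sketch: forward-Euler integration of x' = -x - x^3,
# checking that V(x) = x^2 is decreasing along the solution.

def f(x):
    return -x - x**3

def simulate(x0, dt=1e-3, steps=10_000):
    """Return the Euler trajectory starting from x0."""
    x = x0
    traj = [x]
    for _ in range(steps):
        x += dt * f(x)
        traj.append(x)
    return traj

traj = simulate(2.0)          # arbitrary initial state x(0) = 2
V = [x**2 for x in traj]

# V never increases along the trajectory, and x(10) is close to 0
assert all(V[i + 1] <= V[i] for i in range(len(V) - 1))
assert abs(traj[-1]) < 1e-3
```

The same behavior is observed for negative initial states, since $\dot{V} < 0$ for all $x \neq 0$.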


I don't understand where $\dot{x}$ comes from in $\dot{V}(x) = \dfrac{\partial V}{\partial x} \dot{x}$. Why do we take a partial derivative in this example, and why does $\dot{x}$ appear?

Can we say that the derivative of $V(x)$ is $\dfrac{\partial V}{\partial x} \dot{x}$? Shouldn't it just be $\dot{V}(x) = 2x$?

Answer

The notation $\dot x$ stands for the function given by $t \mapsto \frac{d}{dt}x(t)$, and this rule extends to functions depending on $x$ and $t$ similarly: $\dot V$ stands for the function given by $t \mapsto \frac{d}{dt} \left( V(x(t)) \right)$, which, by the chain rule, expands to $t \mapsto \left(\frac{d}{dx} V \right)(x(t)) \frac{d}{dt} x(t) = \left(\frac{d}{dx} V \right)(x(t))\dot x(t)$.
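This chain-rule identity can be checked numerically. The sketch below uses a hypothetical test trajectory $x(t) = e^{-t}$ (chosen only because its exact derivative is easy to write down, not because it solves the system above) and compares a finite-difference derivative of $V(x(t))$ against $\left(\frac{d}{dx}V\right)(x(t))\,\dot x(t) = 2\,x(t)\,\dot x(t)$:

```python
# Chain-rule check: d/dt V(x(t)) == (dV/dx)(x(t)) * xdot(t).
import math

def x(t):          # hypothetical smooth trajectory x(t) = e^{-t}
    return math.exp(-t)

def xdot(t):       # its exact time derivative
    return -math.exp(-t)

def V(x):
    return x**2

t, h = 0.7, 1e-6
# central finite difference approximating d/dt V(x(t))
fd = (V(x(t + h)) - V(x(t - h))) / (2 * h)
# chain rule: (dV/dx)(x(t)) * xdot(t) = 2 x(t) xdot(t)
cr = 2 * x(t) * xdot(t)
assert abs(fd - cr) < 1e-8
```

The two quantities agree to within the finite-difference error, which is exactly why the question's formula $\dot{V}(x) = \dfrac{\partial V}{\partial x}\dot{x}$ holds: $\dot V$ is the derivative of the composite $t \mapsto V(x(t))$, not of $V$ as a function of $x$ alone.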