Dynamic Systems: Adjustment time to perturbations


I am new to the field of dynamic systems and have what I feel is a pretty basic question. If I have a simple dynamic system $\dot{x}=-kx$ with one stable equilibrium point, and I move my solution away from the equilibrium point by a value $\Delta x$, how long will it take for my solution to converge back to the equilibrium point? Let's call this adjustment time $\tau$. What happens if this is a nonlinear dynamic system, for instance $\dot{x}=-kx^3$? Then how long does it take?

Can someone:

  1. Point me to another question if this has already been asked
  2. Point me to a reference where I can figure this out on my own
  3. Walk me through a solution

There are 4 best solutions below

---

Assuming $x$ is a real number, both of these equations are easy enough to solve. Let's look at the first one here. The equilibrium is at $x^* = 0$. The solution to the equation is clearly

$$x(t;c) = c\exp(-kt)$$

and choosing

$$x(0) = x^* + \Delta x = \Delta x$$

we have

$$x(t) = \Delta x \exp(-kt)$$

If we're curious about when it will reach equilibrium, we just set

$$x(t) = x^* = 0$$

and solve... or rather we would, but that equation has no solution. However, our intuition at this point should lead us to discover that, assuming $k > 0$, we have $x(t) \to x^*$ as $t\to\infty$.

This is quite typical for non-trivial autonomous equations, so much so that there may be a theorem about it. Lyapunov stability theory makes the above precise, but I'm not aware of examples of non-trivial dynamical systems with a finite time of flight to equilibrium, if such a thing exists.
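One remark on that last point, added for completeness: a finite arrival time can occur, but only when the right-hand side fails to be Lipschitz at the equilibrium, so that uniqueness of solutions breaks down there. A standard example is

$$\dot x = -\sqrt{x}, \qquad x(0) = x_0 > 0,$$

whose solution

$$x(t) = \left(\sqrt{x_0} - \tfrac{t}{2}\right)^2, \qquad 0 \le t \le 2\sqrt{x_0},$$

reaches the equilibrium $x^* = 0$ exactly at $t = 2\sqrt{x_0}$ and can be continued as $x \equiv 0$ afterwards. No such example exists when the right-hand side is Lipschitz, since then a solution touching the equilibrium in finite time would violate uniqueness.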


For completeness, note that the nonlinear equation has solution

$$x(t) = \pm \left(\frac{1}{(\Delta x)^2} + 2kt\right)^{-\frac{1}{2}}$$

which again never takes its equilibrium value, yet approaches it as $t\to\infty$. I should also note that $x(t) = x^*$ is only a necessary condition, not sufficient, as we would also need $x'(t) = 0$ to truly be at equilibrium.

---

If this is a 1D equation, the linear system is easy enough to solve:

$$x(t) = x_0 e^{-kt}$$

where $x_0 = x(0)$. Notice that the equilibrium is $x = 0$.

Your "moving away from equilibrium" is equivalent to setting a non-zero value for $x_0$. But notice that you'll never reach the equilibrium value in finite time.

This is an example of uniform asymptotic stability: given some tolerance, the solution will eventually be within this tolerance from the equilibrium. More precisely, given a tolerance $\epsilon> 0$, you'll find a time $T$ such that, for any instant $t \geq T$, you have $\|x(t) - x_{eq}\| \leq \epsilon$. But, as the example shows, you may never truly reach equilibrium.
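As a sketch of this definition in the linear case (my own illustration; the helper names are made up), the time $T$ to enter the band $|x| \le \epsilon$ can be written in closed form and checked against the explicit solution:

```python
import math

# Sketch, assuming x' = -k*x with k > 0 and x(0) = x0 > 0:
# the solution x(t) = x0 * exp(-k t) first satisfies |x| <= eps at
# T = ln(x0/eps) / k, and stays inside the band for all t >= T.
def adjustment_time(x0, k, eps):
    return math.log(x0 / eps) / k

def x(t, x0, k):
    return x0 * math.exp(-k * t)

k, x0, eps = 2.0, 1.0, 1e-3
T = adjustment_time(x0, k, eps)
print(T)                      # about 3.45
print(x(T, x0, k))            # eps = 0.001, up to rounding
print(x(2 * T, x0, k) < eps)  # True: the solution stays inside the band
```

Note that $T \to \infty$ as $\epsilon \to 0$, which is exactly the "never truly reach equilibrium" statement above.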

For nonlinear systems, things can get very complicated. Chaos can happen: small changes in initial conditions are amplified over time, leading to completely different behavior. The most famous example is probably the Lorenz system, a 3D nonlinear system that first arose in meteorological modeling. I recommend reading about it; Wikipedia is a good starting point, but almost any book on the subject will at least mention it.

---

Strictly speaking, a solution will never come to the equilibrium point; this process takes infinite time. A correctly posed question should sound like this: estimate the time for which a solution will come to a given neighborhood of the equilibrium point and stay there forever.

In order to estimate this time, it is not necessary to solve the system. It is enough to know its Lyapunov function.

Let $V(x)$ be a Lyapunov function of the system $$ \dot x= f(x),\quad x\in\mathbb R^n. $$ Suppose that we have some initial point $x(0)$ such that $V(x(0))=C$, where $C$ is some number. Let $\tau'$ be the time at which the solution crosses the level set $\Omega_{C'}$, i.e. $V(x(\tau'))=C'$, $C'<C$.

To estimate the time it takes for a solution to move from the level set $\Omega_{C}$ to the level set $\Omega_{C'}$, we use the fundamental theorem of calculus. We know that
$$ C'-C=V(x(\tau'))-V(x(0))=\int_0^{\tau'}\dot V(x(t))\,dt; $$
on the other hand, the set $M=\{ x:\; C'\le V(x)\le C \}$ is compact, so $\dot V(x)$ attains its greatest value
$$\gamma= \max_{x\in M} \dot V(x)<0$$
on it. Therefore,
$$ \int_0^{\tau'}\dot V(x(t))\,dt\le \gamma\tau' $$
and, dividing by $\gamma<0$ (which reverses the inequality),
$$ \tau'\le \frac1{\gamma}\int_0^{\tau'}\dot V(x(t))\,dt=\frac{C'-C}{\gamma}= \frac{V(x(0))-V(x(\tau'))}{-\gamma}. $$
This is an estimate of the time for which the solution will reach the set $\{x\in\mathbb R^n:\; V(x)\le C'\}$. If we want the neighborhood of the origin that it will reach, we can enclose this set in a sphere of the required radius.
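As a small sanity check of this estimate (my own example, not from the answer above): take $\dot x = -x$ with $V(x) = x^2/2$, so $\dot V = -x^2 = -2V$, and compare the exact crossing time $\tau' = \frac12\ln(C/C')$ with the bound $(C-C')/(-\gamma)$:

```python
import math

# Sanity check of the Lyapunov time estimate for xdot = -x with
# V(x) = x^2/2, so Vdot = -x^2 = -2V.  On M = {C' <= V <= C} the
# maximum of Vdot is gamma = -2*C'.
C, Cp = 2.0, 0.5             # start on level set V = C, target V = C'

gamma = -2.0 * Cp            # max of Vdot on the compact set M
bound = (C - Cp) / (-gamma)  # the estimate derived above

# Exact crossing time: V(x(t)) = C * exp(-2t), so V = C' at t = ln(C/C')/2.
exact = 0.5 * math.log(C / Cp)

print(exact)           # about 0.693
print(bound)           # 1.5
print(exact <= bound)  # True: the estimate is indeed an upper bound
```

The bound is conservative here ($1.5$ versus $\ln 2 \approx 0.693$) because $\gamma$ is the worst-case decay rate over all of $M$.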

---

Regarding the "walk me through" option:

The unique solution to the equation

$\dot x = -kx \tag 1$

with

$k > 0 \tag 2$

and initial position

$x(0) = \Delta x \tag 3$

is

$x(t) = \Delta x e^{-kt}; \tag 4$

this solution never actually reaches the value $0$, though it does become arbitrarily small for $t$ sufficiently large. Indeed, given

$0 < \epsilon < \Delta x, \tag 5$

the time $\tau$ at which

$x(\tau) = \epsilon > 0 \tag 6$

may be found by setting

$\epsilon = \Delta x e^{-k \tau}, \tag 7$

whence

$\dfrac{\epsilon}{\Delta x} = e^{-k \tau}, \tag 8$

and thus

$-k \tau = \ln \left ( \dfrac{\epsilon}{\Delta x} \right ) = \ln \epsilon - \ln \Delta x, \tag 9$

$\tau = -\dfrac{1}{k} \left (\ln \epsilon - \ln \Delta x \right ) = \dfrac{1}{k} \left (\ln \Delta x - \ln \epsilon \right ); \tag{10}$

we note that in accord with (5)

$\ln \Delta x - \ln \epsilon > 0, \tag{11}$

consistent with the anticipated positivity of $\tau$. Of course with

$k < 0, \tag{12}$

the solution

$x(t) \to \infty \; \text{strictly monotonically as} \; t \to \infty. \tag{13}$

It should be observed that in the case $k > 0$ the system point $x(t)$ remains in the interval $(0, \epsilon]$ for all $t \ge \tau$, and that since we may take $\epsilon$ arbitrarily small, it follows that

$x(t) \to 0 \; \text{strictly monotonically as} \; t \to \infty; \tag{14}$

however, there is no finite $\tau$ for which

$x(\tau) = 0. \tag{15}$

If instead of (1) we consider

$\dot x = -kx^3, \tag{16}$

then, since $x(t) \ne 0$ along our solution, we may write

$x^{-3}\dot x = -k, \tag{17}$

that is,

$\dfrac{d}{dt}\left ( -\dfrac{1}{2}x^{-2} \right) = -k, \tag{18}$

or

$\dfrac{d}{dt}\left ( \dfrac{1}{2}x^{-2} \right) = k, \tag{19}$

which we integrate 'twixt $0$ and $t$ in light of (3):

$\dfrac{1}{2}x^{-2}(t) - \dfrac{1}{2} (\Delta x)^{-2} = \displaystyle \int_0^t \dfrac{d}{ds}\left ( \dfrac{1}{2}x^{-2} \right) \; ds = \int_0^t k \; ds = kt, \tag{20}$

which after some easy algebraic maneuvering becomes

$x^2(t) = \dfrac{1}{(\Delta x)^{-2} + 2kt} = \dfrac{(\Delta x)^2}{1 + 2k(\Delta x)^2t}, \tag{21}$

and finally

$x(t) = \dfrac{\Delta x}{\sqrt{1 + 2k(\Delta x)^2t}}; \tag{22}$

it is easy to see that we still have

$x(0) = \Delta x, \tag{23}$

and

$\displaystyle \lim_{t \to \infty} x(t) = 0, \tag{24}$

again strictly monotonically. The time $\tau$ it takes $x(t)$ to reach $\epsilon$ from $x(0) = \Delta x$ satisfies

$\epsilon^2 = \dfrac{(\Delta x)^2}{1 + 2k(\Delta x)^2\tau}, \tag{25}$

from which

$1 + 2k(\Delta x)^2\tau = \dfrac{(\Delta x)^2}{\epsilon^2}, \tag{26}$

$\tau = \dfrac{1}{2k(\Delta x)^2} \left ( \dfrac{(\Delta x)^2}{\epsilon^2} - 1 \right). \tag{27}$
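Formulas (22) and (27) can be checked numerically (a quick sketch of mine; the function names are made up):

```python
import math

# Numeric check of the cubic case xdot = -k x^3 with x(0) = dx > 0:
# solution (22) and adjustment time (27) to reach x = eps.
def x(t, dx, k):
    return dx / math.sqrt(1.0 + 2.0 * k * dx * dx * t)

def tau(dx, k, eps):
    return ((dx / eps) ** 2 - 1.0) / (2.0 * k * dx * dx)

k, dx, eps = 1.0, 1.0, 1e-2
t = tau(dx, k, eps)
print(t)            # 4999.5
print(x(t, dx, k))  # eps = 0.01, up to rounding
```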

Finally, we note that when

$k = 0, \tag{28}$

we find from both (1) and (16) that for any $x$

$\dot x = 0, \tag{29}$

i.e. every $x \in \mathbb R$ is an equilibrium point.
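Comparing the two adjustment times (10) and (27) as $\epsilon \to 0$ makes the linear/nonlinear contrast concrete (a comparison of my own, taking $k = \Delta x = 1$):

```python
import math

# Adjustment time to reach |x| = eps, using (10) for xdot = -x
# and (27) for xdot = -x^3 (both with k = dx = 1).
def tau_linear(eps):
    return math.log(1.0 / eps)            # (1/k)(ln dx - ln eps)

def tau_cubic(eps):
    return ((1.0 / eps) ** 2 - 1.0) / 2.0

for eps in (1e-1, 1e-2, 1e-3):
    print(eps, tau_linear(eps), tau_cubic(eps))
# tau_linear grows like ln(1/eps) while tau_cubic grows like 1/(2 eps^2):
# near the equilibrium x^3 << x, so the cubic flow slows to a crawl.
```

This is the practical difference between the two systems in the original question: the linear one has a well-defined time constant $1/k$, while the nonlinear one has no single time scale at all.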