List of definitions of the various types of ODE stability?


I am studying numerical methods for solving ordinary differential equations (ODEs) and I keep on coming across different kinds of stability. I'm struggling to find the common thread between them.

I was hoping that someone who knows this topic better than I do could draft a list of different kinds of stability of numerical ODE solvers along with their definitions. I'm guessing some are synonyms of each other. And others may be relevant only to the model problem. I'm just looking to discriminate which are which.

Examples of types of stability I've come across include the following. However, this list is probably not exhaustive.

  1. Stability
  2. Zero-Stability
  3. A-Stability
  4. Absolute Stability
  5. Relative Stability
  6. Weak Stability

Thank you!


Per Hairer/Wanner, "Solving Ordinary Differential Equations II: Stiff and Differential-Algebraic Problems":

Stability of numerical methods

The common thread of stability is this: if your ODE system has an attracting sub-manifold, then you expect the numerical solution to converge toward that sub-manifold as well. The simplest case is a homogeneous linear system that is stable at the origin; the numerical method should respect that behavior.

A-stability

In the simplest instance this leads to the test equation $y'=\lambda y$, $\lambda\in \Bbb C$, $\operatorname{Re}(\lambda)\le 0$. For one-step methods this gives the propagation $y_{n+1}=R(\lambda h)\,y_n$, where $R$ is a polynomial for explicit methods and a rational function for implicit methods. The combination of ODE, method, and step size is stable if $|R(\lambda h)|<1$.
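As a concrete illustration (the two methods here are my own assumed examples, not taken from the text): for explicit Euler $R(z)=1+z$ and for implicit Euler $R(z)=1/(1-z)$, the criterion $|R(\lambda h)|<1$ can be checked directly.

```python
# Sketch, assuming explicit/implicit Euler as example one-step methods:
# a step size h is stable for the test equation y' = lam*y
# when |R(lam*h)| < 1.

def R_explicit_euler(z: complex) -> complex:
    # stability function of explicit Euler: R(z) = 1 + z
    return 1 + z

def R_implicit_euler(z: complex) -> complex:
    # stability function of implicit Euler: R(z) = 1/(1 - z)
    return 1 / (1 - z)

lam = -10.0  # a moderately stiff decay rate
for h in (0.05, 0.3):
    z = lam * h
    print(h, abs(R_explicit_euler(z)), abs(R_implicit_euler(z)))
```

For $\lambda=-10$, explicit Euler is stable only for $h<0.2$, while implicit Euler is stable for both step sizes.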

A-stability: a one-step method is itself called A-stable if $|R(z)|\le1$ for all $z\in\Bbb C^-=\{z:\operatorname{Re}(z)\le0\}$.

For multi-step methods $$ y_{n}+\sum_{j=1}^q a_{q-j}y_{n-j} = h\sum_{j=0}^q b_{q-j}f_{n-j}=\lambda h\sum_{j=0}^q b_{q-j}y_{n-j}, $$ the analysis is a little more involved, as one gets a difference equation of order $q>1$. Using a generating function $Y(t)=\sum y_jt^{-j}$, one obtains the solution as $$ Y(t)=\frac{c(Y_0,\lambda h)(t)}{a(t)-\lambda hb(t)}=\sum_k\frac{A_k(Y_0,\lambda h)}{t-r_k(\lambda h)}, $$ where $a(t)=t^q+a_{q-1}t^{q-1}+\dots+a_0$, $b(t)=b_qt^q+\dots+b_0$, and $c(Y_0,\lambda h)(t)$ is a polynomial formed from the initial values $Y_0=(y_0,\dots,y_{q-1})$. The right side is the partial fraction decomposition using the roots $r_k(z)$ of $a(t)-zb(t)$. The solution decays to zero if all $r_k(z)$ lie inside the unit circle for $z=\lambda h$. If one weakens the demand to non-expanding behavior, the condition becomes $|r_k(z)|\le 1$ for simple roots and $|r_k(z)|<1$ for multiple roots. Call the set of all such $z$ the stable set $S$.

A multi-step method is A-stable if $\Bbb C^-\subset S$.
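A short sketch of this root test, using the two-step Adams-Bashforth method as an assumed example ($a(t)=t^2-t$, $b(t)=\tfrac32 t-\tfrac12$): a point $z=\lambda h$ belongs to $S$ exactly when all roots of $a(t)-zb(t)$ satisfy the condition above.

```python
import numpy as np

# Sketch, assuming AB2 as the example multi-step method:
# a(t) = t^2 - t, b(t) = 1.5*t - 0.5; z = lam*h lies in the
# stable set S when all roots r_k(z) of a(t) - z*b(t) have |r_k| <= 1.

A = np.array([1.0, -1.0, 0.0])   # coefficients of a(t), highest degree first
B = np.array([0.0, 1.5, -0.5])   # coefficients of b(t)

def roots(z: complex) -> np.ndarray:
    # roots r_k(z) of the characteristic polynomial a(t) - z*b(t)
    return np.roots(A - z * B)

for z in (0.0, -0.5, -1.5):
    print(z, sorted(abs(r) for r in roots(z)))
```

For AB2 this shows $z=-0.5$ inside $S$ and $z=-1.5$ outside it, consistent with the small stability interval of explicit multi-step methods on the negative real axis.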

Zero-stability

is a pre-condition for stability. It is only relevant for linear multi-step methods; one-step methods have it automatically. It says that if the right-hand side is zero, the numerical solution should be constant or nearly so, even under small perturbations such as small additions of random noise in each step.

More precisely, among the $r_k(0)$ exactly one has value $1$ ($a(1)=0$ is necessary for consistency), and all others have to lie inside the unit circle. Failing that, one either gets a root $r=r_k(0)$ with $|r|>1$, in which case local errors are propagated from step $k$ to step $k+\Delta k$ with factor $r^{\Delta k}=r^{\Delta t/h}$, or a multiple root at $1$, which gives error growth like $\Delta k=\Delta t/h$ (or a higher power of it) over time spans $\Delta t=t_{k+\Delta k}-t_k=\Delta k\,h$; both grow faster the smaller $h$ is. Such behavior is in general not desirable.
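Since only $z=0$ matters here, zero-stability reduces to a root condition on $a(t)$ alone. A sketch with two assumed example methods: AB2 with $a(t)=t^2-t$ (roots $0$ and $1$, zero-stable), and the maximal-order explicit two-step method with $a(t)=t^2+4t-5$ (roots $1$ and $-5$, zero-unstable).

```python
import numpy as np

# Sketch: zero-stability is the root condition on a(t) at z = 0 --
# all roots in the closed unit disk, unit-modulus roots simple.
# The two example polynomials below are assumed, not from the text.

def is_zero_stable(a_coeffs) -> bool:
    r = np.roots(a_coeffs)
    if any(abs(x) > 1 + 1e-12 for x in r):
        return False                       # a root outside the unit circle
    on_circle = [x for x in r if abs(abs(x) - 1) < 1e-12]
    # crude simplicity check: no two unit-modulus roots coincide
    for i in range(len(on_circle)):
        for j in range(i + 1, len(on_circle)):
            if abs(on_circle[i] - on_circle[j]) < 1e-9:
                return False
    return True

print(is_zero_stable([1, -1, 0]))   # AB2: a(t) = t^2 - t
print(is_zero_stable([1, 4, -5]))   # a(t) = t^2 + 4t - 5, root -5
```

The second method has a higher order of consistency than AB2, but its root $-5$ amplifies local errors like $5^{\Delta k}$, which is exactly the failure mode described above.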

A$(\alpha)$-stability

Any A-stable multi-step method has order 2 or less (the second Dahlquist barrier). To get higher-order methods that still satisfy a similar quality criterion, one needs to weaken the definition of A-stability. The most common variant is:

A multi-step method is A$(\alpha)$-stable if $S\supset\{z\ne 0:|\arg(-z)|<\alpha\}$.


L- and I-stability

For one-step methods, if the rational function $R(z)$ has a bounded value at $z=-\infty$, it has the same limit along every path with $|z|\to\infty$. Thus while it may seem intuitive to demand $|R(iy)|=1$ (matching $|e^{iy}|=1$), this would imply that $|R(x)|$ cannot fall to zero as $x\to-\infty$ along the negative real axis.

A method is L-stable if it is A-stable and $\lim_{|z|\to\infty}R(z)=0$.
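To see the difference, compare (as assumed examples) the trapezoidal rule, $R(z)=\frac{1+z/2}{1-z/2}$, which is A-stable but not L-stable, with implicit Euler, $R(z)=\frac{1}{1-z}$, which is L-stable:

```python
# Sketch, assuming the trapezoidal rule and implicit Euler as examples:
# L-stability asks that R(z) -> 0 as |z| -> infinity.

def R_trapezoidal(z: complex) -> complex:
    # |R(z)| -> 1 as z -> -infinity: A-stable but not L-stable
    return (1 + z / 2) / (1 - z / 2)

def R_implicit_euler(z: complex) -> complex:
    # R(z) -> 0 as |z| -> infinity: L-stable
    return 1 / (1 - z)

z = -1e8  # far out on the negative real axis
print(abs(R_trapezoidal(z)))     # close to 1: stiff components barely damped
print(abs(R_implicit_euler(z)))  # close to 0: strong damping
```

In practice this means the trapezoidal rule damps very stiff components only marginally per step, while implicit Euler kills them almost immediately.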

One can also be interested in the fraction $R(z)/e^z$ and where its modulus is smaller than, equal to, or greater than $1$.

A method is I-stable if $|R(iy)|\le 1=|e^{iy}|$ for all real $y$.

I-stability, together with all poles of $R$ lying in the positive half-plane, implies A-stability (by the maximum principle, since $R$ is then analytic on $\Bbb C^-$).
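A numerical sketch of the I-stability check, again with the trapezoidal rule as an assumed example: its $R$ satisfies $|R(iy)|=1$ exactly, and its only pole $z=2$ lies in the positive half-plane, consistent with the implication above.

```python
# Sketch, assuming the trapezoidal rule as the example method:
# I-stability means |R(iy)| <= 1 along the whole imaginary axis.
# Here |1 + iy/2| = |1 - iy/2|, so |R(iy)| = 1 identically.

def R_trapezoidal(z: complex) -> complex:
    return (1 + z / 2) / (1 - z / 2)

samples = [0.1 * k for k in range(1, 200)]
worst = max(abs(R_trapezoidal(1j * y)) for y in samples)
print(worst)  # stays at 1 up to rounding
```

Sampling of course only suggests I-stability; a proof argues with the explicit form of $|R(iy)|^2$ as a ratio of polynomials in $y^2$.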