Summary of the Wikipedia article
Let
- $f$ be "a mathematical problem",
- $\tilde f$ a numerical scheme replicating $f$,
- $x$ the "input" to the problem, and
- $\tilde x$ the "perturbed input".
Then
- $\Vert f(\tilde x)-f(x) \Vert$ is called the condition, describing how the problem changes subject to a perturbed input,
- $\Vert \tilde f(\tilde x)-\tilde f(x) \Vert$ is called the (numerical) stability, describing how the scheme changes subject to a perturbed input,
- $\Vert \tilde f(x)-f(x) \Vert$ is called the consistency, describing how well the scheme solves the problem when provided with the exact input,
- $\Vert \tilde f(\tilde x)-f(x) \Vert$ is called the convergence, describing how well the scheme solves the original problem.
Source: this article on the German-language Wikipedia.
At first glance this looked like a fantastic overview; sadly, I don't understand it.
My definition of Condition
The absolute condition number is the smallest number $\kappa$, such that $$\Vert f(\tilde x)-f(x)\Vert \leq \kappa \Vert x-\tilde x\Vert$$
In other words, I would say the article did not explain "condition" in detail, but the concept is correct.
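To make this concrete, here is a minimal sketch (my own example, not from the article; the choice $f(x)=\sqrt{x}$ at $x=1$ is purely illustrative). For differentiable $f$ and small perturbations, the smallest such $\kappa$ approaches $|f'(x)|$, which is $0.5$ here:

```python
import numpy as np

# Estimate the absolute condition of f(x) = sqrt(x) at x = 1.
# For small perturbations, ||f(x_pert) - f(x)|| / ||x_pert - x||
# approaches |f'(1)| = 0.5, the smallest valid kappa in the limit.
f = np.sqrt
x = 1.0
for eps in [1e-2, 1e-4, 1e-6]:
    x_pert = x + eps
    ratio = abs(f(x_pert) - f(x)) / abs(x_pert - x)
    print(eps, ratio)  # ratios approach 0.5 = |f'(1)|
```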
My definition of Numeric stability
I was taught that numerical schemes for ODEs/PDEs that update a solution iteratively, i.e. algorithms described by an evolution operator $E$ that computes a "new solution" $u^{n+1}$ as a function of $u^n$, $$u^{n+1} = E u^{n},$$
are stable if the operator norm of the evolution operator is bounded by 1: $$\Vert E \Vert \equiv \sup_{u^n \neq 0} \frac{\Vert E u^n\Vert}{\Vert u^n \Vert}\leq 1.$$
This is a measure for whether/how much "local" errors get amplified.
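As a minimal sketch of this criterion (my own example, not from the article): for explicit Euler applied to the scalar test problem $u' = \lambda u$, the evolution operator is the scalar $E = 1 + \lambda\,\Delta t$, so the bound $\Vert E \Vert \leq 1$ becomes the familiar step-size restriction $\Delta t \leq 2/|\lambda|$:

```python
# Stability of explicit Euler for u' = lam * u with lam = -10.
# The evolution operator is the scalar E = 1 + lam * dt, so the scheme
# is stable in the sense above iff |E| <= 1, i.e. dt <= 2/|lam| = 0.2.
lam = -10.0
for dt in [0.05, 0.19, 0.21]:
    E = 1.0 + lam * dt
    print(dt, abs(E), abs(E) <= 1.0)  # stable, stable, unstable
```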
Q1: Is this definition of stability equivalent to the one from Wikipedia stated above? If so, how?
Choosing the letter $u$ for the unknown instead of $x$ was intentional, as I have a hard time making sense of the Wikipedia definition: I don't know what they mean by $x$ -- is it the unknown, or a known variable that we give as an "input" to the problem?
My definition of Consistency
In my understanding, the consistency of a finite difference or finite volume method (not FEM) is the residual one gets from plugging the exact solution into the scheme and subtracting the original equation. It is a measure of how much error is introduced "locally" (i.e. in every step). For a scheme $N$ and a differential operator $L$, the scheme $N$ is consistent if $$N[u] - L[u] = \mathcal{E}(\Delta t, \Delta x),$$
where $u$ is the exact solution of $L[u]=0$ and the residual on the RHS, $\mathcal{E}$ (also called the truncation error), converges to $0$ as $\Delta t, \Delta x\to 0$. From the rate of this convergence we get the order of the scheme.
This definition seems to be consistent with Wikipedia's, if we identify $N\equiv \tilde f$, $L\equiv f$, and $u\equiv x$. Note that we have then identified $x$ with the unknown, not with a known input to the problem as suggested by Wikipedia.
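A minimal sketch of this definition (my own example, not from the article): take $L[u] = u' - \cos(x)$ with exact solution $u(x) = \sin(x)$, and the forward-difference scheme $N[u](x) = (u(x+h) - u(x))/h - \cos(x)$. Since $L[u]=0$ for the exact solution, the residual $N[u] - L[u] = N[u]$ is the truncation error, which should shrink like $O(h)$ (first order):

```python
import numpy as np

# Truncation error of the forward-difference scheme
# N[u](x) = (u(x+h) - u(x))/h - cos(x)
# for L[u] = u' - cos(x) = 0, whose exact solution is u(x) = sin(x).
u = np.sin
x = 1.0
for h in [1e-1, 1e-2, 1e-3]:
    residual = (u(x + h) - u(x)) / h - np.cos(x)
    print(h, abs(residual))  # shrinks ~10x per step: first order
```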
My definition of Convergence
For linear finite difference or finite volume methods, one has the equivalence theorem of Lax and Richtmyer, stating that $$\text{stability and consistency}\iff \text{convergence}$$
If a method is convergent, its discrete solution $u_h$ satisfies $$\Vert u - u_h \Vert \xrightarrow{\Delta x, \Delta t \to 0} 0$$
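A minimal sketch of this property (my own example, not from the article): explicit Euler for $u' = -u$, $u(0)=1$ on $[0,1]$, where the global error at $t=1$ against the exact solution $e^{-1}$ should vanish as $\Delta t \to 0$:

```python
import math

# Convergence of explicit Euler for u' = -u, u(0) = 1 on [0, 1].
# The global error |u(1) - u_h(1)| should go to 0 as dt -> 0 (first order).
for n in [10, 100, 1000]:
    dt = 1.0 / n
    u_h = 1.0
    for _ in range(n):
        u_h = (1.0 - dt) * u_h  # one application of the evolution operator E
    err = abs(math.exp(-1.0) - u_h)
    print(n, err)  # error shrinks ~10x per refinement
```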
Q2: Is this definition of convergence equivalent to the one from Wikipedia stated above? If so, how?
I have no idea whether Wikipedia suggests that $x=u$ here. If so, why do they define convergence as $\Vert \tilde f(\tilde x) - f(x)\Vert$? Or does Wikipedia want me to believe that $f(x)\equiv u$, to be consistent with "my" definition of convergence? But then this would contradict the usage of $x$ in the Wikipedia definition of consistency. It seems that the Lax-Richtmyer result could be derived from the Wikipedia definitions. That would be great, if only their notation made sense.
Summary:
- The definitions of condition and consistency make sense to me,
- the ones for stability and convergence don't.