Could anybody please clarify the relationship between numerical stability and accuracy?


I was reading a paper and came across this statement:

Stability merely avoids uncontrolled error growth but cannot guarantee actual numerical accuracy.

From what I understand, the order of accuracy is the rate at which a numerical approximation of a differential equation converges to the exact solution as the step size decreases.

So the larger the error, the farther the numerical approximation is from the exact solution. Am I correct?

But why can a stable method, where the error is controlled, not guarantee numerical accuracy?

Could somebody please clarify a bit about this concept?

Thanks.

Accepted answer:

This is an issue of imprecise terminology. It comes down to what you mean by "controlled". Stability* is all about the numerical solution not diverging when the exact solution isn't diverging. That means the error is controlled in the sense that it is bounded (for $h$ in some interval $(0,h_c)$ and a fixed time interval $[0,T]$, say) but not that it is going to zero as $h$ goes to zero, which is what you need for accuracy. Indeed the trivial numerical method which just doesn't do anything at all (i.e. $x_{n+1}=x_n$, regardless of what the DE says) is stable but not accurate.
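To make this concrete, here is a small sketch (my own illustration, not from the answer) comparing forward Euler with the trivial "do nothing" method on $x' = -x$, $x(0) = 1$. Both produce bounded (stable) solutions on $[0,1]$, but only Euler's error shrinks as $h \to 0$:

```python
import math

def euler(f, x0, T, h):
    # Forward Euler: x_{n+1} = x_n + h*f(x_n)
    x, t = x0, 0.0
    while t < T - 1e-12:
        x += h * f(x)
        t += h
    return x

def trivial(f, x0, T, h):
    # The "do nothing" method: x_{n+1} = x_n, regardless of the DE
    return x0

f = lambda x: -x          # x' = -x, exact solution x(t) = e^{-t}
exact = math.exp(-1.0)    # exact value x(1)

for h in (0.1, 0.01, 0.001):
    e_euler = abs(euler(f, 1.0, 1.0, h) - exact)
    e_triv  = abs(trivial(f, 1.0, 1.0, h) - exact)
    print(f"h={h:<6} Euler error={e_euler:.2e}  trivial error={e_triv:.2e}")
```

The Euler error drops roughly by a factor of 10 each time $h$ is divided by 10 (first-order accuracy), while the trivial method's error stays fixed at $|1 - e^{-1}| \approx 0.63$ no matter how small $h$ gets: stable, but not convergent.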

* Here I assume we are in the context of numerical methods for differential equations, not in the context of algorithms in floating point arithmetic. The latter also has several notions of "stability" which are also relevant to numerical methods for differential equations, in that they essentially put a lower bound on the step sizes that we can use without catastrophic amounts of roundoff error.
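The roundoff point in the footnote can also be demonstrated with a sketch (again my own illustration, using the classic forward-difference derivative rather than an ODE solver): as $h$ shrinks, truncation error decreases but roundoff error in the subtraction grows, so the total error is minimized at an intermediate $h$ rather than at the smallest one:

```python
import math

# Forward-difference approximation of d/dx sin(x) at x = 1.
# True derivative is cos(1).
x = 1.0
true_d = math.cos(x)

for h in (1e-1, 1e-4, 1e-8, 1e-12):
    approx = (math.sin(x + h) - math.sin(x)) / h
    print(f"h={h:.0e}  error={abs(approx - true_d):.2e}")
```

The error falls as $h$ decreases from $10^{-1}$ to around $10^{-8}$ (roughly $\sqrt{\varepsilon_{\text{mach}}}$ in double precision), then rises again at $10^{-12}$ because the difference $\sin(x+h) - \sin(x)$ loses almost all its significant digits: a floor on usable step sizes imposed by floating-point arithmetic, analogous to the lower bound mentioned above.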