Existence of stable algorithms


Does every well-conditioned problem admit a stable algorithm?

This related question, as well as Carl Christian's comments, suggests that in its original form my question is probably too broad to have a satisfying answer. Therefore:

Consider the problem of evaluating a real expression $f$ (an elementary function, if you like) at $x\in \mathbb{R}$ using a fixed floating point number system. Assume that $x$ and $f(x)$ do not over- or underflow and that the relative condition number of $f$ at $x$, $$ \left| \frac{xf'(x)}{f(x)} \right|, $$ is close to $1$. Is there a problem of this type for which only unstable algorithms are known?

There are many textbook examples of well-conditioned problems, e.g. $f(x) = \log(1+x)$ at $x \approx 0$, for which the obvious 'algorithm' gives inaccurate results, but a mathematically equivalent rewriting, in this case $\log(1+x) = 2 \, \mathrm{artanh} \, (x/(x+2))$, leads to a stable algorithm. Can we be sure that such a trick always exists?
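To make the example concrete, here is a minimal Python sketch (assuming IEEE double precision). For $f(x) = \log(1+x)$ the relative condition number is $|x/((1+x)\log(1+x))|$, which tends to $1$ as $x \to 0$, so the problem is well conditioned; nevertheless the naive formula loses accuracy because the rounded sum $1 + x$ discards most of the information in $x$:

```python
import math

x = 1e-15

# Naive evaluation: 1 + x rounds to 1 + 5*2**-52 in IEEE double,
# so the result carries roughly 11% relative error.
naive = math.log(1 + x)

# Mathematically equivalent rewriting from the question:
# log(1+x) = 2 artanh(x/(x+2)); no cancellation, stable near 0.
stable = 2 * math.atanh(x / (x + 2))

# Library routine designed for exactly this problem, as a reference.
reference = math.log1p(x)

print(naive, stable, reference)
```

Comparing `stable` against `reference` shows agreement to near machine precision, while `naive` differs in the second significant digit.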

These notes seem to give a positive answer to my original, broad question (second item in the summary), but no justification is given. I have also looked in Nicholas J. Higham's Accuracy and Stability of Numerical Algorithms, but was not able to find an answer.

I would also welcome (partial) answers to special cases other than the evaluation of a real function.