Is there a formal definition of "continuous at a point for all practical purposes" or "continuous at a point within a given tolerance"?


Consider a function $f\colon D \to \mathbf{R}$ such that, at some interior point $a$ of its domain $D\subseteq \mathbf{R}$, $$\lim_{x \to a^-}f(x) = L, \qquad \lim_{x \to a^+}f(x) =R, \qquad L \ne R \ ,$$ and the value $f(a)$ may equal either limit or be different from both.

Obviously the function isn't continuous at $a$.

But suppose that we are studying it with some numerical method, and the difference between $L$, $R$, and $f(a)$ is less than the precision used in the method. Say, for example, that $R=2+ \mathrm{e}^{-1000}$ and $L=2$. From the point of view of the numerical method the function can, so to speak, be considered continuous at $a$. A similar situation can arise with functions meant to fit empirical data.

Is there some notion of "approximate continuity at $a$" that applies in this case? An intuitive definition would be something like $\lvert \lim_{x \to a^-}f(x) - \lim_{x \to a^+}f(x) \rvert <\delta$ and $\lvert \lim_{x \to a^-}f(x) - f(a) \rvert <\delta$, and so on, for some tolerance $\delta>0$. But I have no idea whether a definition of this kind is used in computational mathematics, or what it is formally called.
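The intuitive definition above can be sketched numerically. This is only my own illustration of the idea (the helper `approx_continuous`, the step size `h`, and the example step function are assumptions, not part of the question):

```python
def approx_continuous(f, a, delta, h=1e-6):
    """Check 'continuity within tolerance delta' at a: compare crude
    numerical estimates of the one-sided limits with f(a)."""
    L = f(a - h)   # estimate of the left-hand limit
    R = f(a + h)   # estimate of the right-hand limit
    v = f(a)
    return abs(L - R) < delta and abs(L - v) < delta and abs(R - v) < delta

# A step function with a jump of size 1e-3 at x = 0:
step = lambda x: 2.0 if x < 0 else 2.0 + 1e-3

print(approx_continuous(step, 0.0, delta=1e-6))  # jump exceeds tolerance: False
print(approx_continuous(step, 0.0, delta=1e-2))  # jump within tolerance: True
```

The result depends on how `h` relates to the scale on which `f` varies near `a`, which is exactly the kind of precision-dependence the question describes.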

I'd be thankful for any references you know of about some notion similar to this.

Note: The expression "approximate continuity" I use here has nothing to do with the definition of approximate continuity connected with measure theory, of course; but I couldn't find a better expression. Also, I'm aware that in computable analysis all computable functions are continuous.


Best answer:

You can define the oscillation of a function at a point as follows:

Let $f: I \to \Bbb R$ be a bounded function where $I \subset \Bbb R$. Let $a \in I$. Then, for $\delta > 0$, define

$$M(f, a, \delta) = \sup_{|x - a| < \delta}f(x) \quad\text{and}\quad m(f, a, \delta) = \inf_{|x - a| < \delta}f(x).$$

Then, the oscillation is defined as $$o(f, a) := \lim_{\delta \to 0^+}[M(f, a, \delta) - m(f, a, \delta)].$$

(This limit must exist: $M(f, a, \delta) - m(f, a, \delta)$ is nonnegative and nondecreasing in $\delta$, so as $\delta \to 0^+$ it decreases to its infimum.)

Note that $f$ is continuous at $a$ iff $o(f, a) = 0$. Now, for your purpose, you can define "continuous within tolerance $\epsilon$" by saying that "$o(f, a) < \epsilon$ for all $a \in I$."


Edit: If you don't particularly care about the value of the function at $a$ itself, you can modify the definitions of $M$ and $m$ to take the $\sup$/$\inf$ over those $x$ such that $0 < |x - a| < \delta$.
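The oscillation can also be estimated numerically by sampling on shrinking windows around $a$. A minimal sketch; the sampling scheme and the example jump function are my own illustration, not part of the answer:

```python
import math

def oscillation(f, a, delta, n=1000):
    """Estimate M(f, a, delta) - m(f, a, delta) by sampling f
    at n + 1 evenly spaced points of [a - delta, a + delta]."""
    xs = [a - delta + 2 * delta * k / n for k in range(n + 1)]
    vals = [f(x) for x in xs]
    return max(vals) - min(vals)

# sin(x) with an artificial jump of size 1e-3 at x = 0
f = lambda x: math.sin(x) if x < 0 else math.sin(x) + 1e-3

# As delta shrinks, the estimate approaches the true oscillation 1e-3,
# so a would count as "continuous within tolerance" 1e-2 but not 1e-4.
for d in (1e-1, 1e-3, 1e-6):
    print(d, oscillation(f, 0.0, d))
```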

Another answer:

In engineering and applied mathematics, when two functions $f,g$ are equal at all but finitely many points, it often makes sense to consider them the same function. In other words, almost-everywhere (a.e.) equality is good enough. A more technical way of saying this is that engineers and researchers like to work with functions modulo a.e. equality.

There are good reasons for this. First, the derivatives of $f,g$ are also a.e. equal. More importantly, $\|f-g\|^2=\int_a^b |f(x)-g(x)|^2 \, dx=0$, which implies that $f$ and $g$ share the same Fourier series. It follows that many theorems of real analysis continue to hold when equality is replaced by a.e. equality. Intuitively, the two functions agree everywhere except at a single point, and that one point carries no information about the function as a whole. Your line of thinking is correct.
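The claim that $\|f-g\|^2 = 0$ shows up concretely in numerical work: quadrature never sees a single-point difference. A sketch under my own assumptions (the trapezoidal integrator and the example pair $f, g$ are illustrative, not from the answer):

```python
def trapezoid(f, a, b, n=10000):
    """Composite trapezoidal rule for the integral of f over [a, b]."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + k * h) for k in range(1, n))
    return s * h

f = lambda x: x * x
g = lambda x: 5.0 if x == 1/3 else x * x  # differs from f only at x = 1/3

# The sample grid never hits the single discrepant point (1/3 is not a
# grid node), so numerically ||f - g||^2 = 0 and the integrals agree.
print(trapezoid(lambda x: (f(x) - g(x)) ** 2, 0.0, 1.0))
print(trapezoid(f, 0.0, 1.0), trapezoid(g, 0.0, 1.0))
```

With a random or irrationally spaced grid the discrepant point is missed almost surely, which is the probabilistic face of "measure zero".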

There are indeed generalizations of continuity that tolerate exceptional points of this kind; piecewise continuity is one example.