Let $f \in C^{\infty}[1,2]$ be a function we would like to approximate, and let $g$ be such an approximation; you may assume $g$ is a spline (at least quadratic). In the literature a bound on the approximation error is usually given, but such bounds hold only asymptotically: if $h$ is the mesh size of a uniform interpolation, the stated bound holds as $h \rightarrow 0$. So I was planning to find the actual maximum error using numerical optimization algorithms. I don't have a deep background in this area, though I do have literature I could consult, so I just need a clue about which technique to look at. How do I find
$$ \epsilon_M = \max_{x \in [1,2)} |f(x) - g(x)| $$
My attempt, as an algorithm design, would be:
- Compute the derivative $\left(f(x) - g(x)\right)'$.
- Find all solutions of the equation $\left(f(x) - g(x)\right)' = 0$, using Newton-Raphson for example.
- Let $\left\{x_k\right\}_k$ be those solutions, together with the endpoints of the interval (where the maximum may also be attained); then $\epsilon_M = \max_{k} |f(x_k) - g(x_k)|$.
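The steps above can be sketched numerically. This is a minimal illustration, not a definitive implementation: the test function `f = exp` and the mesh of 6 knots are my own assumptions, and Brent's method is used in place of raw Newton-Raphson because $(f-g)'$ has jumps in its higher derivatives at the knots, where bracketing root-finders are more robust.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.optimize import brentq

# Hypothetical test function; any smooth f with known f' works.
f = np.exp
fp = np.exp  # f'

# Interpolating cubic spline g on a uniform mesh over [1, 2].
knots = np.linspace(1.0, 2.0, 6)
g = CubicSpline(knots, f(knots))
gp = g.derivative()

# e'(x) = f'(x) - g'(x): bracket its sign changes on a fine grid,
# then solve e'(x) = 0 on each bracket with Brent's method.
ep = lambda x: fp(x) - gp(x)
grid = np.linspace(1.0, 2.0, 2001)
vals = ep(grid)
roots = [brentq(ep, a, b)
         for a, b, va, vb in zip(grid[:-1], grid[1:], vals[:-1], vals[1:])
         if va * vb < 0]

# Candidates: stationary points of the error, plus the endpoints and
# knots (the error vanishes at the knots for an interpolant, but
# including them costs nothing).
candidates = np.concatenate([roots, knots])
eps_M = np.abs(f(candidates) - g(candidates)).max()
print(eps_M)
```

For this choice of $f$ and mesh, `eps_M` comes out on the order of the classical $O(h^4)$ cubic-spline error bound.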
Further assumptions:
a. You can assume $f(x)$ is convex on $[1,2)$.
b. The spline is an interpolant, of degree 2 or 3.
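Under assumption (b) there is also a derivative-free alternative worth knowing about: since $g$ interpolates $f$ at the knots, the error vanishes there and typically has a single interior extremum per knot interval, so one can maximize $|f - g|$ on each interval with a bounded 1-D optimizer. A sketch, again with `f = exp` and a 6-knot uniform mesh as assumed placeholders:

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.optimize import minimize_scalar

f = np.exp  # hypothetical test function

knots = np.linspace(1.0, 2.0, 6)
g = CubicSpline(knots, f(knots))

# Maximize |f - g| on each knot interval by minimizing its negative
# with a bounded scalar optimizer; no derivatives of f are needed.
neg_abs_err = lambda x: -abs(f(x) - g(x))
eps_M = max(-minimize_scalar(neg_abs_err, bounds=(a, b), method="bounded").fun
            for a, b in zip(knots[:-1], knots[1:]))
print(eps_M)
```

This avoids computing $f'$ entirely, at the cost of relying on the one-extremum-per-interval heuristic; if the error oscillates within an interval, the per-interval optimizer may return a local rather than global maximum there.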