Let $f:[a,b]\to\mathbb R$ be an infinitely differentiable function. Define the left (right) best-fit parabola as follows:
The left [right] best-fit parabola is the quadratic function $p(x)=Ax^2+Bx+C$ such that $p(a)=f(a),p(b)=f(b)$, and $p'(a)=f'(a)$ $\big[p'(b)=f'(b)\big]$.
A bit of algebra reveals that, for each parabola, there is exactly one choice of coefficients $A,B,C$ satisfying these conditions. You can get a visual of these parabolas on this Desmos module and play around with them.
My question is: under what conditions do these two parabolas form upper and lower bounds for $f$ on $[a,b]$? Some experimentation in the above Desmos module reveals that this is often the case, but not always. For example, it fails for $f(x)=\tanh(x)$, shown here, even on intervals where $f$ is increasing and concave down.
These counterexamples seem to occur when the third derivative changes sign on the interval (as in the above case). So my suspicion is that the parabolas bound the function if the third derivative does not change sign on the interval. Is this conjecture valid, and if so, can someone help me prove it?
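For anyone who wants to experiment beyond Desmos, here is a small numerical sketch (my own throwaway code, not part of the question itself) that constructs both parabolas by solving the two $3\times 3$ linear systems from the definition and then checks the bounding property on a grid; the interval for $\tanh$ below is an arbitrary choice.

```python
import numpy as np

def best_fit_parabolas(f, df, a, b):
    """Solve the two 3x3 linear systems for the left and right
    best-fit parabolas p(x) = A x^2 + B x + C on [a, b].
    (Function names are mine, purely for illustration.)"""
    rows = lambda c: [[a * a, a, 1], [b * b, b, 1], [2 * c, 1, 0]]
    left = np.linalg.solve(rows(a), [f(a), f(b), df(a)])   # p'(a) = f'(a)
    right = np.linalg.solve(rows(b), [f(a), f(b), df(b)])  # p'(b) = f'(b)
    return left, right

# Try tanh on an arbitrarily chosen interval and test, on a grid,
# whether f stays between the two parabolas.
a, b = 0.2, 1.5
f, df = np.tanh, lambda x: 1.0 / np.cosh(x) ** 2
(A1, B1, C1), (A2, B2, C2) = best_fit_parabolas(f, df, a, b)
x = np.linspace(a, b, 2001)
p1 = A1 * x**2 + B1 * x + C1
p2 = A2 * x**2 + B2 * x + C2
lo, hi = np.minimum(p1, p2), np.maximum(p1, p2)
print(bool(np.all(lo <= f(x) + 1e-9) and np.all(f(x) <= hi + 1e-9)))
```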
I will start with the converse, i.e., whether the condition is necessary. Below are the bounds for $\sin(2\pi x)$ and $\cos(2\pi x)$. As can be seen, the third derivative changes sign for both of them, yet $\sin$ is bounded by its parabolas and $\cos$ is not. So your conjecture is clearly not a necessary criterion. I still need to think about whether it is sufficient.
Here's another example I came up with
$$y(x) = (x-0.5)^5$$ and
$$y(x) = (x-0.5)^5 + 0.001 \,\mathcal{N}\!\left(\frac{x-0.5}{0.001}\right),$$
where $\mathcal{N}$ is the density of the standard normal distribution. As can be seen, both functions are bounded by their parabolas, but the latter's third derivative changes sign while the former's does not.
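The quintic case is easy to check numerically. Here is a quick sketch (the helper name is mine; the parabola coefficients below come from solving the interpolation constraints on $[0,1]$):

```python
import numpy as np

def check_bounded(f, f0, f1v, d0, d1, n=4001):
    """Check on a grid whether f lies between its left and right
    best-fit parabolas on [0, 1], given the endpoint data
    f(0), f(1), f'(0), f'(1).  (Helper written just for this post.)"""
    x = np.linspace(0.0, 1.0, n)
    # Coefficients obtained by solving the interpolation constraints on [0, 1].
    p_left = (f1v - f0 - d0) * x**2 + d0 * x + f0
    p_right = (f0 - f1v + d1) * x**2 + (2 * f1v - 2 * f0 - d1) * x + f0
    lo, hi = np.minimum(p_left, p_right), np.maximum(p_left, p_right)
    y = f(x)
    return bool(np.all(lo <= y + 1e-9) and np.all(y <= hi + 1e-9))

f = lambda x: (x - 0.5) ** 5
# Endpoint data of the quintic: f(0), f(1), f'(0), f'(1).
print(check_bounded(f, -0.5**5, 0.5**5, 5 * 0.5**4, 5 * 0.5**4))  # → True
```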
I can't seem to come up with a counterexample to the sufficiency criterion by trial and error, so perhaps it does hold. Here's an attempt to prove it. I will present it as far as I get; perhaps somebody has an idea on how to continue.
WLOG, assume that the interval of interest is $[0,1]$. We will use two bounding functions:
$$f_1(x) = A_1 x^2 + B_1 x + C_1$$ $$f_2(x) = A_2 x^2 + B_2 x + C_2$$
The constraints are
$$ \begin{eqnarray} f_1(0) &=& C_1 = f(0) \\ f_1(1) &=& A_1 + B_1 + C_1 = f(1) \\ f_1'(0) &=& B_1 = f'(0) \\ \end{eqnarray} $$
and
$$ \begin{eqnarray} f_2(0) &=& C_2 = f(0) \\ f_2(1) &=& A_2 + B_2 + C_2 = f(1) \\ f_2'(1) &=& 2A_2 + B_2 = f'(1) \\ \end{eqnarray} $$
Solving the above constraints, we obtain
$$ \begin{eqnarray} f_1(x) &=& [f(1) - f(0) - f'(0)]x^2 + f'(0) x + f(0)\\ f_2(x) &=& [f(0) - f(1) + f'(1)]x^2 + [2f(1) - 2f(0) - f'(1)]x + f(0) \end{eqnarray} $$
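As a sanity check on the algebra, the solved coefficients can be verified against the constraints for random endpoint data (a throwaway script, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(100):
    # Random endpoint data f(0), f(1), f'(0), f'(1).
    f0, f1, d0, d1 = rng.normal(size=4)
    # Solved coefficients of f_1 and f_2 from above.
    A1, B1, C1 = f1 - f0 - d0, d0, f0
    A2, B2, C2 = f0 - f1 + d1, 2 * f1 - 2 * f0 - d1, f0
    assert abs(C1 - f0) < 1e-12              # f_1(0)  = f(0)
    assert abs(A1 + B1 + C1 - f1) < 1e-12    # f_1(1)  = f(1)
    assert abs(B1 - d0) < 1e-12              # f_1'(0) = f'(0)
    assert abs(C2 - f0) < 1e-12              # f_2(0)  = f(0)
    assert abs(A2 + B2 + C2 - f1) < 1e-12    # f_2(1)  = f(1)
    assert abs(2 * A2 + B2 - d1) < 1e-12     # f_2'(1) = f'(1)
print("all constraints satisfied")
```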
This is where I get stuck. Why does assuming that $f$ is a third antiderivative of a function of constant sign (equivalently, that $f'''$ does not change sign) force $f$ to be bounded by these simple functions of its endpoint data? The idea feels related to the mean value theorem, Green's theorem, and/or the fundamental theorem of calculus, but none of them seems to fit directly.
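One direction that might make this precise (just a sketch, so please check the details): consider the error of the left parabola,
$$e_1(x) = f(x) - f_1(x).$$
By construction $e_1(0) = e_1(1) = 0$ and $e_1'(0) = 0$, and since $f_1$ is quadratic, $e_1''' = f'''$. Now suppose $f''' \ge 0$ on $[0,1]$. Then $e_1''$ is nondecreasing, so $e_1'$ is convex. Rolle's theorem applied to $e_1$ gives some $c \in (0,1)$ with $e_1'(c) = 0$, and a convex function vanishing at $0$ and $c$ satisfies $e_1' \le 0$ on $[0,c]$ and $e_1' \ge 0$ on $[c,1]$. So $e_1$ decreases and then increases, and since it vanishes at both endpoints, $e_1 \le 0$, i.e. $f \le f_1$. The symmetric argument with $e_2 = f - f_2$ (which satisfies $e_2'(1) = 0$ instead) should give $f \ge f_2$, and the case $f''' \le 0$ should follow by applying the same reasoning to $-f$.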