Say I have a function $f(x)$ and I want to find an approximation $\hat{f}_n(x)$ of this function (e.g. a polynomial, piecewise polynomial, truncated basis expansion, etc.) constructed by evaluating $f$ at $n$ points. Suppose also that $f$ is monotone increasing; clearly, a piecewise linear $\hat{f}_n$ will then also be monotone increasing. What is the best method to construct an $\hat{f}_n$ that is at least twice continuously differentiable and also monotone increasing? If, in addition, $f$ is convex, what is the best way to ensure that $\hat{f}_n$ is also convex?
For context, my function $f(x)$ arises from numerically evaluating an integral. I want to evaluate $f$ rapidly in the inner loop of an optimization problem, so I'd like to approximate it by something cheaper to evaluate; however, the approximation needs to inherit the convexity and monotonicity of $f$.
Not enough reputation to comment, so I'm posting this as an answer even though it's not a complete one. Akima interpolation, which is built from piecewise cubic polynomials, almost gets you what you want: it is once continuously differentiable (not twice), and with some additional constraints it can be made monotone increasing.
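As a quick sketch of what this looks like in practice (assuming SciPy is available), `Akima1DInterpolator` gives the plain $C^1$ Akima fit, while `PchipInterpolator` is a closely related piecewise-cubic method with the monotonicity constraint already built in (Fritsch–Carlson), so it may be a ready-made version of the "add some constraints" idea:

```python
import numpy as np
from scipy.interpolate import Akima1DInterpolator, PchipInterpolator

# Sample a monotone increasing f at n points (toy example in place of
# the OP's numerically evaluated integral).
x = np.linspace(0.0, 2.0, 11)
y = np.sqrt(x) + x

akima = Akima1DInterpolator(x, y)  # C^1, piecewise cubic; not monotone in general
pchip = PchipInterpolator(x, y)    # C^1, piecewise cubic; monotonicity guaranteed

xs = np.linspace(0.0, 2.0, 500)
# PCHIP preserves the monotonicity of the data between the nodes.
assert np.all(np.diff(pchip(xs)) >= 0.0)
```

Note that both are only $C^1$, so neither fully answers the twice-differentiability requirement; they are cheap to evaluate in an inner loop, though.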
As for guaranteeing convexity, I'm not sure. Since convexity is equivalent to the derivative being monotonically non-decreasing, you might be able to enforce that property on the interpolating polynomials you use (as you already noted, piecewise linear interpolation satisfies it). One way to enforce it would be to require that all polynomial coefficients are nonnegative and that $x$ is restricted to $x \ge 0$: then every derivative of the polynomial is nonnegative there, so the fit is both increasing and convex.
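A minimal sketch of that last idea, assuming a monomial basis and a nonnegative least-squares fit (`scipy.optimize.nnls`) in place of interpolation: with all coefficients $c_k \ge 0$ and $x \ge 0$, both $\hat{f}'$ and $\hat{f}''$ are sums of nonnegative terms, so monotonicity and convexity hold by construction, and the result is $C^\infty$.

```python
import numpy as np
from scipy.optimize import nnls

# Toy convex, increasing target on x >= 0 (stand-in for the real f).
x = np.linspace(0.0, 2.0, 50)
y = np.exp(x) - 1.0

# Monomial design matrix; restricting the coefficients to be
# nonnegative makes every derivative of the fitted polynomial
# nonnegative on x >= 0, hence increasing and convex there.
degree = 5
A = np.vander(x, degree + 1, increasing=True)
coef, residual = nnls(A, y)

# Second derivative is a sum of nonnegative terms for x >= 0.
d2 = sum(k * (k - 1) * c * x ** (k - 2) for k, c in enumerate(coef) if k >= 2)
assert np.all(coef >= 0.0)
assert np.all(d2 >= 0.0)
```

The trade-off is that this is a constrained fit, not an interpolant, so it won't match $f$ exactly at the nodes; whether that accuracy loss is acceptable depends on how tight the inner-loop tolerance is.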