I want to approximate a function by piecewise linear interpolation and would like to understand how to place the associated interval points optimally.
Take a continuously differentiable function $f: X \rightarrow \mathbf{R}$, with $X \subset \mathbf{R}$ being a closed and bounded subset with boundary points $\min(X) = x_0$ and $\max(X) = x_{N+1}$. Suppose the domain is split into $N+1$ intervals $S_i = [x_i, x_{i+1}]$, $i = 0, \ldots, N$. Writing $m_i = \frac{x_i + x_{i+1}}{2}$ for the midpoint of $S_i$, the approximating function $g: X \rightarrow \mathbf{R}$ is defined as: \begin{align*} g(x) &= \sum_{i=0}^{N} g_{i}(x) \, \mathbf{I}_{x \in [x_i, x_{i+1}]} \\ &= \sum_{i=0}^{N} \left\lbrace f(m_i) + f'(m_i) \left[ x - m_i \right] \right\rbrace \mathbf{I}_{x \in [x_i, x_{i+1}]} \end{align*} The indicator function $\mathbf{I}_{x \in [x_i, x_{i+1}]}$ activates $g_i$, the first-order Taylor approximation of $f$ around the midpoint of the interval containing $x$. Define the loss function: \begin{equation} \mathcal{L}(x_1, \ldots, x_N) = \int_X \left( f(x) - g(x) \right)^{2} \, dx \end{equation} My aim is to find the interior points $x_1, \ldots, x_N$ that minimize the loss function. Could somebody kindly educate me on choosing the interval boundaries optimally? I am grateful for any help!
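To make the setup concrete, here is a minimal stdlib-only Python sketch of the loss for a given set of knots, using $f(x) = e^x$ on $[0, 1]$ as a stand-in example (the choice of $f$, the midpoint-rule quadrature, and the brute-force search over a single interior knot are all my assumptions, just to illustrate the objective):

```python
import math

def f(x):
    # stand-in example function (assumption): f(x) = exp(x)
    return math.exp(x)

def fprime(x):
    return math.exp(x)

def g(x, knots):
    # knots = [x_0, x_1, ..., x_{N+1}]; on each interval, g is the tangent
    # line to f at the interval midpoint (first-order Taylor approximation)
    for a, b in zip(knots, knots[1:]):
        if a <= x <= b:
            m = 0.5 * (a + b)
            return f(m) + fprime(m) * (x - m)
    raise ValueError("x outside domain")

def loss(knots, n_quad=2000):
    # midpoint-rule quadrature of (f - g)^2 over [x_0, x_{N+1}]
    lo, hi = knots[0], knots[-1]
    h = (hi - lo) / n_quad
    return h * sum((f(lo + (k + 0.5) * h) - g(lo + (k + 0.5) * h, knots)) ** 2
                   for k in range(n_quad))

# brute-force search for a single interior knot x_1 on a grid in (0, 1)
best = min((loss([0.0, x1 / 100, 1.0]), x1 / 100) for x1 in range(1, 100))
print(best)
```

Even this one-knot case shows the trade-off: where $|f''|$ is larger, intervals should be shorter, so the optimal knot is generally not the midpoint of the domain.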
Your problem reminds me of a similar one developed here, in Chapter 4.
It is solved with a graph-theoretic approach, more specifically with a shortest-path algorithm that gives you the optimal set of knots.
It is explained in good detail; take a look!