Find derivatives (Hermite interpolation) via optimization: ensure uniqueness of the optimization problem


Given a set of interpolation points $\boldsymbol x = \{x_1,...,x_n\}$ and interpolation values $\boldsymbol y = \{y_1,...,y_n\}$. On the interval $[x_i, x_{i+1}]$, we can write a $C^1$ cubic Hermite polynomial $$ p_i(t_i(x)) = y_i H_0(t_i(x)) + y_{i+1} H_1(t_i(x)) + h_i y'_iH_2(t_i(x)) + h_{i} y'_{i+1}H_3(t_i(x)) $$ where $y_i, y_{i+1}$ are the interpolation values and $y'_{i}, y'_{i+1}$ are the unknown derivatives at the breaks. The basis functions are given by $$ H_0(t) = 1 - 3t^2 + 2t^3 \\ H_1(t) = 3t^2 - 2t^3 \\ H_2(t) = t - 2t^2 + t^3 \\ H_3(t) = -t^2 + t^3 $$ and $ t_i(x) = \frac{x-x_i}{x_{i+1}-x_i} = \frac{x-x_i}{h_i} $ is the mapping from real coordinates to the unit interval $t \in [0,1]$.
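For concreteness, a single segment can be evaluated directly from these formulas. This is a minimal sketch (the function and argument names are mine, not from any library):

```python
import numpy as np

def hermite_segment(xq, xi, xi1, yi, yi1, di, di1):
    """Evaluate the cubic Hermite polynomial p_i at xq in [xi, xi1],
    given values yi, yi1 and derivatives di, di1 at the segment ends."""
    h = xi1 - xi
    t = (xq - xi) / h                 # map to the unit interval [0, 1]
    H0 = 1 - 3*t**2 + 2*t**3
    H1 = 3*t**2 - 2*t**3
    H2 = t - 2*t**2 + t**3
    H3 = -t**2 + t**3
    return yi*H0 + yi1*H1 + h*di*H2 + h*di1*H3
```

A quick sanity check: cubic Hermite interpolation reproduces cubics exactly, so with the values and derivatives of $f(x) = x^3$ at $x=1$ and $x=2$, the segment returns $1.5^3 = 3.375$ at the midpoint.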

I set up an optimization problem where the unknown parameters are the derivatives at the breaks, $\boldsymbol d = [y'_1, ..., y'_n]$. In my objective function, I want to minimize the summed squared jumps in the second derivatives at the interior breaks, that is, $$ \min_{\boldsymbol d} \sum\limits_{i=2}^{n-1} \left[ p''_i(0) - p''_{i-1}(1) \right]^2 $$ under the constraint that the second derivative of the interpolant is nonnegative on the interpolation interval $[x_1,x_n]$. I impose these constraints at $m$ discrete points. Say $\boldsymbol z \in \mathbb{R}^m$ stores $m$ equally spaced points between $x_1$ and $x_n$. Then there are $m$ linear constraints $p''_j(t_j(z_i)) \ge 0$, where the index $j$ denotes the polynomial segment in which the point $z_i$ lies. Note that each $p''$ is affine in $\boldsymbol d$, so the objective is a convex quadratic and the constraints are linear: this is a quadratic program.
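A sketch of how this problem could be assembled and solved numerically. The data, the number of constraint points, and the use of SciPy's general-purpose SLSQP solver are my own illustrative choices (a dedicated QP solver would exploit the structure better); the second-derivative formulas follow from the basis functions above via the chain rule:

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

# Illustrative data (my own choice): convex samples on nonuniform knots.
x = np.array([0.0, 0.5, 1.2, 2.0, 3.0])
y = x**2
n = len(x)
h = np.diff(x)

# With t = (x - x_i)/h_i we get d^2 p_i/dx^2 = (1/h_i^2) d^2 p_i/dt^2, and
# H0'' = -6 + 12t, H1'' = 6 - 12t, H2'' = -4 + 6t, H3'' = -2 + 6t,
# so p_i'' is affine in d = [y'_1, ..., y'_n]: p_i''(t) = row @ d + const.
def ppp_affine(i, t):
    row = np.zeros(n)
    row[i]     = (-4.0 + 6.0*t) / h[i]
    row[i + 1] = (-2.0 + 6.0*t) / h[i]
    const = ((-6.0 + 12.0*t)*y[i] + (6.0 - 12.0*t)*y[i + 1]) / h[i]**2
    return row, const

# Objective: sum of squared second-derivative jumps at interior breaks,
# written as ||A d - b||^2 with one row per interior break.
A = np.zeros((n - 2, n))
b = np.zeros(n - 2)
for k in range(1, n - 1):            # interior break x_k (0-based)
    r0, c0 = ppp_affine(k, 0.0)      # p_k''(0): left end of segment k
    r1, c1 = ppp_affine(k - 1, 1.0)  # p_{k-1}''(1): right end of segment k-1
    A[k - 1] = r0 - r1
    b[k - 1] = c1 - c0

# Linear constraints p''(z_i) >= 0 at m equally spaced points.
m = 40
z = np.linspace(x[0], x[-1], m)
G = np.zeros((m, n))
g = np.zeros(m)
for idx, zi in enumerate(z):
    j = min(np.searchsorted(x, zi, side="right") - 1, n - 2)
    row, const = ppp_affine(j, (zi - x[j]) / h[j])
    G[idx] = row
    g[idx] = -const                  # row @ d + const >= 0  <=>  row @ d >= -const

obj = lambda d: np.sum((A @ d - b) ** 2)
res = minimize(obj, x0=np.gradient(y, x),
               jac=lambda d: 2.0 * A.T @ (A @ d - b),
               constraints=LinearConstraint(G, g, np.inf),
               method="SLSQP")
print(res.x)                         # candidate derivatives at the breaks
```

For this particular data ($y_i = x_i^2$), the exact derivatives $y'_i = 2x_i$ are feasible and give a zero objective.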

Clearly, this problem with these constraints only makes sense if the "shape" of the interpolation values $y_i$ is somehow close to convex. This is something I want to assume. My motivation for this problem was the observation that traditional $C^2$ cubic spline interpolation can produce negative second derivatives even though the "shape" of the interpolation values is convex. Positive second derivatives are, however, crucial for my application.

My overall goal is to compute sensitivities, i.e., the derivatives of the interpolant w.r.t. the interpolation values $y_i$. To do this, the optimization problem must have a unique solution. Otherwise, a sensitivity calculation via numerical differentiation is nonsense, if I am not mistaken.
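For reference, this is what such a numerical-differentiation sensitivity looks like when the interpolant *is* uniquely determined by the data. I use SciPy's unconstrained `CubicSpline` purely as a stand-in for the solution of the optimization problem; the data, evaluation point, and step size are made up:

```python
import numpy as np
from scipy.interpolate import CubicSpline

x = np.array([0.0, 1.0, 2.0, 3.0])
y = x**2
x_eval = 1.5          # point where the interpolant is queried
eps = 1e-6            # central-difference step

# Sensitivity of s(x_eval) w.r.t. each interpolation value y_j,
# by perturbing one value at a time and rebuilding the spline.
sens = np.zeros(len(y))
for j in range(len(y)):
    yp, ym = y.copy(), y.copy()
    yp[j] += eps
    ym[j] -= eps
    sens[j] = (CubicSpline(x, yp)(x_eval) - CubicSpline(x, ym)(x_eval)) / (2 * eps)

# Since the spline is linear in y, these are the cardinal basis values;
# they sum to 1 because the spline reproduces constants.
print(sens, sens.sum())
```

If the solution of the constrained problem were non-unique, the two re-solves in the difference quotient could land on different branches and this quotient would be meaningless, which is exactly the concern above.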

So does this optimization problem have a unique solution?

If not, how can I rewrite/modify it such that it has a unique solution, so that I can eventually compute sensitivities? As I said, my objective is to produce an (at least $C^1$) interpolating spline that has positive second derivatives and allows me to calculate sensitivities. If you can think of a better approach to this problem, I would appreciate your opinion!