Is there any tool to do curve fitting with derivative values? I.e. I have a bunch of values of the function at certain points, a bunch of values of the function's derivative at certain points, a bunch of values of the function's second derivative at certain points, and I want to find the simplest function that obeys these constraints.
Curve fitting with derivatives
5.7k views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · 4 answers below
Have a look at the Hermite basis polynomials (not to be confused with the "classical" Hermite orthogonal polynomials): they can be used in place of the Bernstein basis polynomials that are classically used for Bézier curves.
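As a concrete illustration (a minimal sketch, not part of the answer above), the four cubic Hermite basis functions on $[0,1]$ can be written down directly; a curve built from them interpolates prescribed endpoint values and first derivatives:

```python
# Cubic Hermite basis functions on [0, 1] (a minimal sketch).
# p(t) = p0*h00(t) + m0*h10(t) + p1*h01(t) + m1*h11(t) interpolates
# values p0, p1 and first derivatives m0, m1 at t = 0 and t = 1.
def h00(t): return 2*t**3 - 3*t**2 + 1
def h10(t): return t**3 - 2*t**2 + t
def h01(t): return -2*t**3 + 3*t**2
def h11(t): return t**3 - t**2

def hermite_cubic(t, p0, m0, p1, m1):
    return p0*h00(t) + m0*h10(t) + p1*h01(t) + m1*h11(t)
```

Evaluating at $t=0$ and $t=1$ confirms that the basis reproduces the endpoint values, and its derivatives reproduce the endpoint slopes.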
Suppose you have $n+1$ constraints on function values and derivatives. You should be able to satisfy these constraints with a polynomial of degree $n$, since this will have $n+1$ coefficients $a_0, \ldots, a_n$. Each constraint will give you a linear equation involving $a_0, \ldots, a_n$. Pass these equations to your favorite linear solver, and you will (usually) get a solution. I say "usually" because sometimes the linear system will not be solvable. The conditions for solvability are complicated. If you want to understand them, look up the topic of "Hermite interpolation".
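For instance (a hypothetical set of constraints, chosen for illustration), four conditions on a cubic $y = a_0 + a_1 x + a_2 x^2 + a_3 x^3$ become a $4 \times 4$ linear system that NumPy can solve directly:

```python
import numpy as np

# Hypothetical constraints: y(0)=1, y'(0)=0, y(1)=2, y'(1)=0.
# Each constraint is one linear equation in (a0, a1, a2, a3).
A = np.array([
    [1.0, 0.0, 0.0, 0.0],   # y(0)  = a0
    [0.0, 1.0, 0.0, 0.0],   # y'(0) = a1
    [1.0, 1.0, 1.0, 1.0],   # y(1)  = a0 + a1 + a2 + a3
    [0.0, 1.0, 2.0, 3.0],   # y'(1) = a1 + 2 a2 + 3 a3
])
b = np.array([1.0, 0.0, 2.0, 0.0])
a = np.linalg.solve(A, b)   # coefficients [a0, a1, a2, a3]
```

Here the system happens to be nonsingular; as noted above, certain combinations of constraints can make it unsolvable.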
Problem statement
Start with a polynomial and its derivatives: $$ \begin{align} y(x) &= a_{0} + a_{1} x + a_{2} x^{2} + a_{3} x^{3}, \\ y'(x) &= a_{1} + 2 a_{2} x + 3 a_{3} x^{2}, \\ y''(x) &= 2 a_{2} + 6 a_{3} x. \end{align} $$ (The monomial basis differs only by an invertible linear transformation from any other polynomial basis, such as the Legendre polynomials.) The fit looks for the best set of $n=4$ parameters $a$.
There are three sets of measurements: one of function values, one of first derivatives, and one of second derivatives. $$ \left\{ x_{1,k}, y_{k} \right\}_{k=1}^{\mu_{1}}, \quad \left\{ x_{2,k}, y'_{k} \right\}_{k=1}^{\mu_{2}}, \quad \left\{ x_{3,k}, y''_{k} \right\}_{k=1}^{\mu_{3}}. $$ This general method accounts for the different types of measurements (function value, first derivative, second derivative) taken at different locations $x$.
Construct linear system
Your conditions lead to a block structure for the system matrix $\mathbf{A}$. The three blocks have different measurement locations $x$ and measurements $y$, but they share the same coefficient vector $a$. $$ \begin{align} \mathbf{A} a &= y \\ % \left[ \begin{array}{cccc} % 1 & x_{1,1} & x_{1,1}^{2} & x_{1,1}^{3} \\ 1 & x_{1,2} & x_{1,2}^{2} & x_{1,2}^{3} \\ \vdots & \vdots & \vdots & \vdots \\ 1 & x_{1,\mu_{1}} & x_{1,\mu_{1}}^{2} & x_{1,\mu_{1}}^{3} \\\hline % 0 & 1 & 2 x_{2,1} & 3x_{2,1}^{2} \\ 0 & 1 & 2 x_{2,2} & 3x_{2,2}^{2} \\ \vdots & \vdots & \vdots & \vdots \\ 0 & 1 & 2 x_{2,\mu_{2}} & 3x_{2,\mu_{2}}^{2} \\\hline % 0 & 0 & 2 & 6x_{3,1} \\ 0 & 0 & 2 & 6x_{3,2} \\ \vdots & \vdots & \vdots & \vdots \\ 0 & 0 & 2 & 6x_{3,\mu_{3}} % \end{array} \right] % \left[ \begin{array}{c} a_{0} \\ a_{1} \\ a_{2} \\ a_{3} \end{array} \right] % & = % \left[ \begin{array}{c} y_{1} \\ y_{2} \\ \vdots \\ y_{\mu_{1}} \\\hline y'_{1} \\ y'_{2} \\ \vdots \\ y'_{\mu_{2}} \\\hline y''_{1} \\ y''_{2} \\ \vdots \\ y''_{\mu_{3}} \end{array} \right] % \end{align} $$
The data vector has $m = \mu_{1} + \mu_{2} + \mu_{3}$ rows, and the solution vector has length $n=4$. In general, the matrix will have full column rank $\rho = n$. The dimensions are $$ \mathbf{A} \in \mathbb{C}^{m \times n}_{\rho}, \quad y \in \mathbb{C}^{m}, \quad a \in \mathbb{C}^{n}. $$
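The block matrix can be assembled programmatically. A sketch in NumPy, using the measurement locations from the worked example (each block's rows follow $y$, $y'$, $y''$ applied to the monomial basis):

```python
import numpy as np

# Measurement locations for the three blocks (from the worked example).
x1 = np.arange(1, 11) / 10.0   # function-value measurements
x2 = np.arange(1, 7) / 6.0     # first-derivative measurements
x3 = np.arange(1, 6) / 5.0     # second-derivative measurements

# Rows apply y, y', y'' to a0 + a1 x + a2 x^2 + a3 x^3.
A = np.vstack([
    np.column_stack([np.ones_like(x1), x1, x1**2, x1**3]),
    np.column_stack([np.zeros_like(x2), np.ones_like(x2), 2*x2, 3*x2**2]),
    np.column_stack([np.zeros_like(x3), np.zeros_like(x3),
                     2*np.ones_like(x3), 6*x3]),
])
```

The stacked matrix has $m = 10 + 6 + 5 = 21$ rows and full column rank $n = 4$.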
Least squares solution
The least squares minimizers are defined as $$ a_{LS} = \left\{ a \in \mathbb{C}^{n} \colon \lVert \mathbf{A} a - y \rVert_{2}^{2} \text{ is minimized} \right\}. $$ A rich toolkit offers many paths to a solution. One method is the normal equations $$ \mathbf{A}^{*} \mathbf{A} a = \mathbf{A}^{*} y, $$ which, when $\mathbf{A}$ has full column rank, have the unique solution $$ a = \left( \mathbf{A}^{*} \mathbf{A} \right)^{-1} \mathbf{A}^{*} y. $$
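In code, the normal equations can be solved directly, though an orthogonalization-based solver such as NumPy's `lstsq` is generally better conditioned. A sketch with toy data (assumed here for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy design matrix and noisy data: y = 1 + 2 x + noise.
A = np.column_stack([np.ones(8), np.linspace(0.0, 1.0, 8)])
y = A @ np.array([1.0, 2.0]) + 0.01 * rng.standard_normal(8)

# Normal equations: fine for small, well-conditioned problems.
a_ne = np.linalg.solve(A.T @ A, A.T @ y)

# QR/SVD-based solver: numerically preferable in general.
a_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
```

Both routes return the same minimizer here; they differ in numerical robustness when $\mathbf{A}$ is ill-conditioned, since forming $\mathbf{A}^{*}\mathbf{A}$ squares the condition number.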
Example
Start with ideal data and add random noise. The solution vector is $$ % a = % \left[ \begin{array}{r} a_{0} \\ a_{1} \\ a_{2} \\ a_{3} \end{array} \right] % = % \left[ \begin{array}{r} 1 \\ -2 \\ 3 \\ 4 \end{array} \right]. % $$ The data sets are $$ % x_{1} = % \frac{1}{10} \left[ \begin{array}{r} 1 \\ 2 \\ 3 \\ 4 \\ 5 \\ 6 \\ 7 \\ 8 \\ 9 \\ 10 \end{array} \right], \, % y = % \frac{1}{500} \left[ \begin{array}{r} 417 \\ 376 \\ 389 \\ 468 \\ 625 \\ 872 \\ 1221 \\ 1684 \\ 2273 \\ 3000 \end{array} \right], \quad %%% x_{2} = % \frac{1}{6} \left[ \begin{array}{r} 1 \\ 2 \\ 3 \\ 4 \\ 5 \\ 6 \end{array} \right], \, % y' = % \frac{1}{3} \left[ \begin{array}{r} -2 \\ 4 \\ 12 \\ 22 \\ 34 \\ 48 \end{array} \right], \quad %%% x_{3} = % \frac{1}{5} \left[ \begin{array}{r} 1 \\ 2 \\ 3 \\ 4 \\ 5 \end{array} \right], \, % y'' = % \frac{1}{5} \left[ \begin{array}{r} 54 \\ 78 \\ 102 \\ 126 \\ 150 \end{array} \right] $$ Before perturbation, the linear system looks like this: $$ \left[ \begin{array}{rrrr} 1 & 0.1 & 0.01 & 0.001 \\ 1 & 0.2 & 0.04 & 0.008 \\ 1 & 0.3 & 0.09 & 0.027 \\ 1 & 0.4 & 0.16 & 0.064 \\ 1 & 0.5 & 0.25 & 0.125 \\ 1 & 0.6 & 0.36 & 0.216 \\ 1 & 0.7 & 0.49 & 0.343 \\ 1 & 0.8 & 0.64 & 0.512 \\ 1 & 0.9 & 0.81 & 0.729 \\ 1 & 1 & 1 & 1 \\\hline 0 & 1 & 0.333333 & 0.0833333 \\ 0 & 1 & 0.666667 & 0.333333 \\ 0 & 1 & 1 & 0.75 \\ 0 & 1 & 1.33333 & 1.33333 \\ 0 & 1 & 1.66667 & 2.08333 \\ 0 & 1 & 2 & 3 \\\hline 0 & 0 & 2 & 1.2 \\ 0 & 0 & 2 & 2.4 \\ 0 & 0 & 2 & 3.6 \\ 0 & 0 & 2 & 4.8 \\ 0 & 0 & 2 & 6 \end{array} \right] % \left[ \begin{array}{r} a_{0} \\ a_{1} \\ a_{2} \\ a_{3} \end{array} \right] % = % \frac{1}{1500} \left[ \begin{array}{r} 1251 \\ 1128 \\ 1167 \\ 1404 \\ 1875 \\ 2616 \\ 3663 \\ 5052 \\ 6819 \\ 9000 \\\hline -1000 \\ 2000 \\ 6000 \\ 11000 \\ 17000 \\ 24000 \\\hline 16200 \\ 23400 \\ 30600 \\ 37800 \\ 45000 \end{array} \right] % $$ A random error $-0.1 < \epsilon < 0.1$ was then added to each $y$ value.
The normal equations are $$ % \frac{1}{1800000} \left[ \begin{array}{rrrr} 18000000 & 9900000 & 6930000 & 5445000 \\ 9900000 & 17730000 & 18045000 & 18209940 \\ 6930000 & 18045000 & 58759940 & 90824850 \\ 5445000 & 18209940 & 90824850 & 174558629 \\ \end{array} \right] % a = % \left[ \begin{array}{r} 23.2848 \\ 56.2244 \\ 283.846 \\ 522.72 \end{array} \right] % $$ The perturbed solution is $$ a_{LS} = \left( \mathbf{A}^{*} \mathbf{A} \right)^{-1} \mathbf{A}^{*} y = \left[ \begin{array}{r} 1.10766 \\ -2.09876 \\ 3.02645 \\ 3.99983 \end{array} \right] $$
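The example can be checked numerically. A sketch assuming NumPy: with noiseless data the least squares fit recovers the coefficients exactly, and adding noise to $y$ perturbs them slightly, as in the solution above.

```python
import numpy as np

# Reproduce the worked example with noiseless data.
a_true = np.array([1.0, -2.0, 3.0, 4.0])
x1 = np.arange(1, 11) / 10.0
x2 = np.arange(1, 7) / 6.0
x3 = np.arange(1, 6) / 5.0

# Block design matrix: y, y', y'' rows for a cubic.
A = np.vstack([
    np.column_stack([np.ones_like(x1), x1, x1**2, x1**3]),
    np.column_stack([np.zeros_like(x2), np.ones_like(x2), 2*x2, 3*x2**2]),
    np.column_stack([np.zeros_like(x3), np.zeros_like(x3),
                     2*np.ones_like(x3), 6*x3]),
])
y = A @ a_true                 # ideal, unperturbed data

a_fit, *_ = np.linalg.lstsq(A, y, rcond=None)
```

Without noise, `a_fit` matches `a_true` to machine precision; perturbing `y` by random errors in $(-0.1, 0.1)$ yields coefficients close to those quoted above.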
If you're asking "can you find a function given its derivative and an initial value?" then the answer is yes (presuming the derivative is integrable), and this is precisely the fundamental theorem of calculus.
If you have only some values of the derivative and an initial value of the function, then you can numerically approximate the shape of the original curve; this is Euler's method.
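A minimal sketch of Euler's method (the ODE here is a hypothetical example, not from the question):

```python
# Euler's method: step forward using only derivative values
# and an initial condition (a minimal sketch).
def euler(f, x0, y0, h, n):
    """Approximate y' = f(x, y), y(x0) = y0, with n steps of size h."""
    xs, ys = [x0], [y0]
    for _ in range(n):
        y0 += h * f(x0, y0)
        x0 += h
        xs.append(x0)
        ys.append(y0)
    return xs, ys

# Example: y' = y with y(0) = 1; y(1) should be near e ≈ 2.718.
xs, ys = euler(lambda x, y: y, 0.0, 1.0, 0.01, 100)
```

The global error shrinks roughly linearly with the step size $h$, which is why Euler's method is usually a starting point rather than a production integrator.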