How can I theoretically estimate the interpolation accuracy described below?


Suppose there are two functions with no closed-form expression, but whose values $(x, y)$ are available at any point we need.

Function 1 $f_1 : \mathbf{R} \to \mathbf{R}$

Function 2 $f_2: \mathbf{R} \to \mathbf{R}$

There is a deterministic, error-free function $g: \mathbf{R} \to \mathbf{R}$ satisfying $f_1 = f_2 \cdot g$ (pointwise multiplication), and $f_1$ is sampled at the same points as $f_2$ and $g$.

The values of $f_1$ range roughly over $[0, 200]$, while $f_2$ only ranges over $[-2, 1]$.

Suppose we need to interpolate both $f_1$ and $f_2$, sometimes from more sample points and sometimes from fewer.

It seems that $f_2$ has less variation because its range $[-2, 1]$ is shorter than $f_1$'s range $[0, 200]$. Can I conclude that interpolating $f_2$ will give higher accuracy (smaller error)? If so, where can I find a theoretical proof?
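For context on where such a proof would start: the standard remainder formula for polynomial interpolation at nodes $x_0, \dots, x_n$ (found in any numerical analysis text) bounds the error by a higher derivative of the function, not by its range:

$$f(x) - p_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!} \prod_{i=0}^{n} (x - x_i), \qquad \xi \in (x_0, x_n).$$

So a shorter range alone does not guarantee smaller interpolation error: a function confined to $[-2, 1]$ can still oscillate rapidly and have large derivatives.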

If we measure the relative error instead, which function should theoretically be interpolated with higher accuracy (smaller relative error)?
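Since the actual $f_1$ and $f_2$ are not specified, one can only probe this numerically with synthetic stand-ins. The sketch below (an illustration, not a proof; the choices of $f_2$ and $g$ are my own assumptions) compares the maximum absolute and relative error of piecewise-linear interpolation for a pair $f_1 = f_2 \cdot g$:

```python
import numpy as np

# Hypothetical stand-ins for the unspecified functions:
# f2 stays in a narrow range; g scales it so f1 = f2 * g spans a wide range.
f2 = lambda x: np.sin(3 * x) - 0.5       # range roughly [-1.5, 0.5]
g  = lambda x: 100 * (x + 1.1)           # deterministic, error-free
f1 = lambda x: f2(x) * g(x)

x_fine  = np.linspace(0.0, 1.0, 2001)    # dense evaluation grid
x_nodes = np.linspace(0.0, 1.0, 21)      # interpolation nodes

abs_errs, rel_errs = {}, {}
for name, f in [("f1", f1), ("f2", f2)]:
    # Piecewise-linear interpolant built from the sampled nodes.
    p = np.interp(x_fine, x_nodes, f(x_nodes))
    err = np.abs(p - f(x_fine))
    abs_errs[name] = err.max()
    # Guard the denominator: relative error blows up near zeros of f.
    rel_errs[name] = (err / (np.abs(f(x_fine)) + 1e-12)).max()
    print(f"{name}: max abs err = {abs_errs[name]:.3e}, "
          f"max rel err = {rel_errs[name]:.3e}")
```

In this particular example the narrow-range $f_2$ does have a smaller absolute error, but its relative error is dominated by the points where $f_2$ passes through zero, which illustrates why the answer for relative error depends on where the functions vanish, not just on their ranges.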

Thank you!