What is the general descriptor of this class of numerical interpolation schemes?


I have been working on developing some n-dimensional interpolation code for a project of mine, and have developed something that works well. But I am wondering how it would be categorized according to current theory in the area of numerical interpolation. I am looking for references to cite in my documentation.

The algorithm is essentially a general implementation of bilinear, bicubic, trilinear, tricubic, (etc.) interpolation, in the sense that a 1D interpolation is performed sequentially along each of the n dimensions. It accepts input of arbitrary dimension and allows any 1D interpolation method to be applied. Gradients of the overall interpolant are also computed, when they are available from the 1D interpolator used. I developed it because no such general scheme seems to exist in the Python scientific computing ecosystem.

Has there been any theory written on this general class of high-order interpolants performed by dimensional separation (e.g. Bilinear, Bicubic, Trilinear, Tricubic)? I can find information about them individually, but nothing that seems to discuss the core idea as a whole.

1 Answer

It's called tensor product interpolation: the n-dimensional interpolant is the tensor product of 1D interpolation operators, applied along one axis at a time. Bilinear, bicubic, trilinear, and tricubic interpolation are all special cases of this construction.
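To make the construction concrete, here is a minimal sketch of the scheme the question describes, using NumPy and plain linear interpolation for the 1D step (the function name and structure are illustrative, not taken from any particular library):

```python
import numpy as np

def tensor_product_interp(axes, values, point):
    """Interpolate an n-D gridded array at `point` by applying a 1D
    linear interpolation along each axis in turn (tensor product scheme).

    axes   : list of 1D sorted coordinate arrays, one per dimension
    values : n-D array with values.shape[i] == len(axes[i])
    point  : sequence of n query coordinates
    """
    data = np.asarray(values, dtype=float)
    # Collapse one dimension per pass: interpolating along the last
    # axis removes it, until only a scalar remains.
    for axis, x in zip(reversed(axes), reversed(list(point))):
        data = np.apply_along_axis(
            lambda col: np.interp(x, axis, col), -1, data)
    return float(data)
```

With two axes and linear 1D interpolation this reproduces bilinear interpolation; swapping `np.interp` for a cubic 1D interpolator (e.g. `scipy.interpolate.CubicSpline`) would give the bicubic/tricubic family. For the linear case, SciPy's `RegularGridInterpolator` computes the same tensor product interpolant.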