What is a way to construct a smooth polynomial surface ($\mathbb{R}^2 \rightarrow \mathbb{R}$) with Lagrange-polynomial properties in every partial derivative? I want to try this for image interpolation.
Is there a generalization of the Lagrange polynomial to 3D?
6.3k Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 best solutions below.
Introduce an auxiliary variable $t$ so that all three quantities, $x$, $y$, and $f$, are functions of $t$. You would then have interpolation nodes in four-dimensional space like this:
t x y f
- - - -
1 3 7 4
2 4 7 3
3 4 4 9
The idea is to leverage Lagrange interpolation to obtain an explicit formula for $f(t)$. However, with a tabulation in this naive form it would be quite challenging to eliminate the newly introduced variable $t$ (it can be done with a Gröbner basis).
However, according to [1], almost every linear combination of $x$ and $y$ will also work. In our example, let $t = 100x + y$:
t x y f
--- - - -
307 3 7 4
407 4 7 3
404 4 4 9
Now you have $f(100 x+y)$, which satisfies your partial derivative criteria.
[1] H. J. Stetter, Numerical Polynomial Algebra. SIAM, 2004.
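The construction above can be sketched in a few lines of Python. This is a minimal illustration using the answer's example table; the function names are mine, not from the answer.

```python
# Auxiliary-variable trick: fold (x, y) into t = 100*x + y,
# then do ordinary 1-D Lagrange interpolation in t.
points = [(3, 7, 4), (4, 7, 3), (4, 4, 9)]  # rows (x, y, f) from the table
ts = [100 * x + y for x, y, _ in points]     # t values: 307, 407, 404
fs = [f for _, _, f in points]

def lagrange_1d(t, nodes, values):
    """Evaluate the Lagrange interpolant through (nodes[k], values[k]) at t."""
    total = 0.0
    for k, vk in enumerate(values):
        basis = 1.0
        for i, ti in enumerate(nodes):
            if i != k:
                basis *= (t - ti) / (nodes[k] - ti)
        total += vk * basis
    return total

def f(x, y):
    """The bivariate interpolant f(100*x + y)."""
    return lagrange_1d(100 * x + y, ts, fs)

# The interpolant reproduces every tabulated value exactly.
for x, y, fv in points:
    assert abs(f(x, y) - fv) < 1e-9
```

Note that the choice $t = 100x + y$ works here because it maps the three nodes to three distinct $t$ values; a linear combination that collides two nodes would fail, which is the sense in which "almost every" combination works.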
While ignoring most of the OP's wishes, by far the simplest
interpolation between four neighbouring pixels is the bilinear one:
$$
P(x,y) = a + b\,x + c\,y + d\,xy
$$
Substitute the four neighbouring pixel positions:
$$
P(i,j) = a + b\,i + c\,j + d\,ij \\
P(i+1,j) = a + b\,(i+1) + c\,j + d\,(i+1)j \\
P(i,j+1) = a + b\,i + c\,(j+1) + d\,i(j+1) \\
P(i+1,j+1) = a + b\,(i+1) + c\,(j+1) + d\,(i+1)(j+1)
$$
Solve for the coefficients $(a,b,c,d)$ and substitute. Or make an educated guess.
Then for $i \le x \le i+1$ and $j \le y \le j+1$ we find:
$$
P(x,y) = (x-i)(y-j)P(i+1,j+1) + (x-i)(j+1-y)P(i+1,j) \\
+ (i+1-x)(y-j)P(i,j+1) + (i+1-x)(j+1-y)P(i,j)
$$
I would most probably stick to this if it were my research (but it isn't).
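The bilinear formula above can be checked directly in code. This is a minimal sketch; the function name and argument order are mine.

```python
def bilinear(P00, P10, P01, P11, x, y, i, j):
    """Bilinear interpolation over the cell [i, i+1] x [j, j+1].

    P00 = P(i, j), P10 = P(i+1, j), P01 = P(i, j+1), P11 = P(i+1, j+1),
    using the closed-form expression from the answer.
    """
    return ((x - i) * (y - j) * P11
            + (x - i) * (j + 1 - y) * P10
            + (i + 1 - x) * (y - j) * P01
            + (i + 1 - x) * (j + 1 - y) * P00)

# Sanity check: the formula reproduces the corner values,
# and the cell centre is the average of the four corners.
assert bilinear(5.0, 7.0, 1.0, 3.0, 2, 8, 2, 8) == 5.0   # x=i, y=j
assert bilinear(5.0, 7.0, 1.0, 3.0, 3, 9, 2, 8) == 3.0   # x=i+1, y=j+1
assert bilinear(0.0, 0.0, 0.0, 4.0, 2.5, 8.5, 2, 8) == 1.0
```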
If you fix the value of one variable, say $x$, you have the Lagrange interpolation formula
$$ f(x, y) = \sum_{k=1}^N f(x, y_k)\prod_{i \neq k} \frac{y - y_i}{y_k - y_i} $$
For each fixed value $y = y_k$ you can construct a Lagrange polynomial for the function $f(x, y_k)$:
$$ f(x, y_k) = \sum_{m=1}^N f(x_m, y_k)\prod_{i \neq m} \frac{x - x_i}{x_m - x_i} $$
The result is an expansion in a tensor-product Lagrange basis, where each basis function equals $1$ at its own grid node and $0$ at all others:
$$ \prod_{i \neq m} \frac{x_p - x_i}{x_m - x_i} \prod_{i \neq k} \frac{y_q - y_i}{y_k - y_i} = \delta_{mp}\,\delta_{kq} $$
We have constructed one of many possible polynomial interpolants; splines generalize this construction piecewise.
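The tensor-product construction above can be sketched as follows. This is a minimal Python illustration under my own naming; the grid and values are made-up test data, not from the answer.

```python
def lagrange_basis(nodes, k, t):
    """k-th Lagrange basis polynomial over `nodes`, evaluated at t."""
    val = 1.0
    for i, ti in enumerate(nodes):
        if i != k:
            val *= (t - ti) / (nodes[k] - ti)
    return val

def tensor_lagrange(xs, ys, F, x, y):
    """Tensor-product Lagrange interpolant, where F[m][k] = f(xs[m], ys[k])."""
    return sum(F[m][k] * lagrange_basis(xs, m, x) * lagrange_basis(ys, k, y)
               for m in range(len(xs)) for k in range(len(ys)))

# The interpolant reproduces the grid values exactly (the delta property).
xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0]
F = [[1.0, 2.0], [0.5, -1.0], [3.0, 0.0]]
for m in range(len(xs)):
    for k in range(len(ys)):
        assert abs(tensor_lagrange(xs, ys, F, xs[m], ys[k]) - F[m][k]) < 1e-9
```

Restricted to a $2 \times 2$ grid this reduces exactly to the bilinear interpolation of the previous answer.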
In theory, the Weierstrass approximation theorem says that continuous functions on an interval can be uniformly approximated by polynomials. By the argument above, it is plausible that two-variable polynomials are dense in the continuous functions on the square. This is a special case of the Stone-Weierstrass theorem:
$$ \overline{ \mathbb{R}[x,y]} = C^0\big([0,1]^2\big) $$
Using linear interpolation we can always draw some surface through your control points $(x_k, y_k, f(x_k, y_k))$. The Stone-Weierstrass theorem says this surface can always be a polynomial, and the difference between your approximation $f_1(x,y) = \sum a_{mn} x^m y^n$ and your original function $f$ can be made arbitrarily small:
$$ \sup_{(x,y)\in [0,1]^2} |f_1(x,y) - f(x,y)| < \epsilon $$
In practice, it would be really useful to know the degree of the polynomials you find and how to build them (find the coefficients). Splines let you do just that.