How to interpolate (polynomial) a matrix with matrix multiplication?


I would like to know how to do polynomial interpolation on a matrix. I have a feeling that it involves the Vandermonde matrix, but that's about it.

I don't want to use a computer algorithm to do it, either. I would like to do it with as much linear algebra as possible.

For example, here is my solution for linear interpolation using matrix multiplication.

Consider the $2\times 3$ matrix $A=\begin{bmatrix} 1&2&3\\4&5&6 \end{bmatrix}$

If $L=\begin{bmatrix} 1& 0\\ 0.5& 0.5\\ 0& 1 \end{bmatrix}$ and $R=\begin{bmatrix} 1& 0.5& 0& 0& 0\\ 0& 0.5& 1& 0.5& 0\\ 0& 0& 0& 0.5& 1 \end{bmatrix}$

then the linear interpolation of $A_{2\times 3}$ is

$(LAR)_{3\times 5}=\begin{bmatrix} 1.0& 1.5& 2.0& 2.5& 3.0\\ 2.5& 3.0& 3.5& 4.0& 4.5\\ 4.0& 4.5& 5.0& 5.5& 6.0 \end{bmatrix}$

This can easily be generalized to any $m\times n$ matrix: its linear interpolation is a $(2m-1)\times(2n-1)$ matrix. It's also easy to add multiple interpolated columns/rows. No messy for loops.
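To check my construction, here is a small NumPy sketch that builds $L$ and $R$ for any size (the helper name `lerp_rows` is just mine); odd rows of the interpolation matrix copy the original rows and even rows average the two neighbours:

```python
import numpy as np

def lerp_rows(m):
    """(2m-1) x m linear-interpolation matrix: even-indexed rows
    copy the originals, odd-indexed rows average neighbours."""
    L = np.zeros((2 * m - 1, m))
    for i in range(2 * m - 1):
        if i % 2 == 0:
            L[i, i // 2] = 1.0
        else:
            L[i, i // 2] = L[i, i // 2 + 1] = 0.5
    return L

# The 2x3 example from above.
A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
L = lerp_rows(2)       # 3x2, acts on the rows
R = lerp_rows(3).T     # 3x5, the column version is just the transpose
print(L @ A @ R)       # reproduces the 3x5 matrix above
```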

How can I accomplish polynomial interpolation using matrix multiplication? (Vandermonde?)
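To make the question concrete, here is the kind of Vandermonde construction I imagine (just a sketch to check my linear algebra; `poly_interp_matrix` is my own name, and I'm not sure this is the right generalization). The idea: if $V$ is the Vandermonde matrix at the original sample points and $V_{\text{new}}$ at the refined points, then $V_{\text{new}}V^{-1}$ maps samples of a polynomial to samples of the same polynomial on the finer grid:

```python
import numpy as np

def poly_interp_matrix(n, factor=2):
    """k x n matrix (k = factor*(n-1)+1) such that M @ y samples the
    degree-(n-1) interpolant of y on a grid refined by `factor`."""
    x = np.arange(n, dtype=float)                    # original points 0..n-1
    x_new = np.linspace(0.0, n - 1.0, factor * (n - 1) + 1)
    V = np.vander(x, increasing=True)                # n x n Vandermonde
    V_new = np.vander(x_new, N=n, increasing=True)   # k x n
    return V_new @ np.linalg.inv(V)

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
Lp = poly_interp_matrix(2)      # 3x2, acts on the columns of A
Rp = poly_interp_matrix(3).T    # 3x5, acts on the rows of A
# Since each row/column of A is linear, the polynomial interpolant
# coincides with the linear one here.
print(Lp @ A @ Rp)
```

For $n=2$ this reduces exactly to the $L$ matrix in my linear example, which is reassuring, but I'd like the general linear-algebra story.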

Bonus question: none of the linear-interpolation algorithms I came across use matrix multiplication. I would think matrix multiplication would be more efficient, since GPUs are basically designed to compute things this way. Any ideas why?

Thanks :D