Inspired by this question. I am aware of the connections between
- polynomial multiplication and convolution of coefficients,
- convolution and multiplication by a circulant matrix.
These two ought to combine to let us represent multiplication of polynomials as matrix multiplication (or am I mistaken?). But how could one do the same with a rational function? Would it, for some magical reason, be convertible to matrix division?
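Both connections can be checked numerically. The following is a sketch in Python/NumPy (the polynomials `p`, `q` are arbitrary illustrative choices; `scipy.linalg.circulant` is assumed available):

```python
import numpy as np
from scipy.linalg import circulant

# Coefficients of p(x) = 1 + 2x and q(x) = 3 + x + x^2 (lowest degree first)
p = np.array([1.0, 2.0])
q = np.array([3.0, 1.0, 1.0])

# 1) Polynomial multiplication is convolution of coefficient vectors.
pq = np.convolve(p, q)              # coefficients of p(x)*q(x)

# 2) Circular convolution is multiplication by a circulant matrix;
#    zero-padding to full length makes circular agree with linear convolution.
n = len(p) + len(q) - 1
p_pad = np.pad(p, (0, n - len(p)))
q_pad = np.pad(q, (0, n - len(q)))
C = circulant(p_pad)                # circulant matrix built from p
assert np.allclose(C @ q_pad, pq)
```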
Edit: Own attempts so far.
If, instead of circulant matrices (which represent circular convolution), we work with Toeplitz matrices (which represent zero-padded convolution) and let $1+\frac{x}{2}$ be represented by:
$$\left[\begin{array}{cccccc} 1&0.5&0&0&0&0\\ 0&1&0.5&0&0&0\\ 0&0&1&0.5&0&0\\ 0&0&0&1&0.5&0\\ 0&0&0&0&1&0.5\\ 0&0&0&0&0&1 \end{array}\right]$$ I hope you can see which coefficients go where. The inverse of the above matrix is: $$\left[\begin{array}{cccccc} 1&-0.5&0.25&-0.125&0.0625&-0.03125\\ 0&1&-0.5&0.25&-0.125&0.0625\\ 0&0&1&-0.5&0.25&-0.125\\ 0&0&0&1&-0.5&0.25\\ 0&0&0&0&1&-0.5\\ 0&0&0&0&0&1 \end{array}\right]$$
Which we can see is a (truncated) geometric series: the reciprocal of the polynomial is $\frac{1}{1+x/2}=1-\frac{x}{2}+\frac{x^2}{4}-\dots$. If we subtract $1$, we get $\frac{1}{1+x/2}-1=\frac{1-(1+x/2)}{1+x/2}=\frac{-x/2}{1+x/2}$, which is the reciprocal of $-\frac{1+x/2}{x/2}$.
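The $6\times 6$ example above can be verified numerically; a NumPy sketch, with the matrix laid out exactly as in the question (coefficient of $x^k$ on the $k$-th superdiagonal):

```python
import numpy as np

n = 6
# Upper-triangular Toeplitz matrix representing p(x) = 1 + x/2
P = np.eye(n) + 0.5 * np.eye(n, k=1)

Pinv = np.linalg.inv(P)
# First row of the inverse: truncated geometric series
# 1/(1 + x/2) = 1 - x/2 + x^2/4 - x^3/8 + ...
expected = (-0.5) ** np.arange(n)
assert np.allclose(Pinv[0], expected)
```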
Here we will see a minor inconvenience: the matrix representing $x$ is nilpotent, and so cannot have an inverse. The same is true for the polynomial $x$ itself. We can find a pseudo-inverse, but it will lack the last column - the coefficient for the highest power. We will need to deal with this somehow, but let's set it aside for a moment and get back to the counterexample.
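The nilpotency is easy to see concretely: in the $6\times 6$ truncation, $x$ is represented by the shift matrix $N$ with $N^6=0$, and its Moore-Penrose pseudo-inverse is just the transposed shift. A NumPy sketch:

```python
import numpy as np

n = 6
N = np.eye(n, k=1)      # Toeplitz matrix representing the polynomial x

assert np.allclose(np.linalg.matrix_power(N, n), 0)   # nilpotent: N^6 = 0
assert np.linalg.matrix_rank(N) == n - 1              # singular, no inverse

# The pseudo-inverse is the backward shift N^T, which indeed loses the
# information carried by the highest-power coefficient.
assert np.allclose(np.linalg.pinv(N), N.T)
```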
We try to limit ourselves to ${\mathbb R}^{4\times 4}$ blocks; with $p(x)=1+x$ represented as above, our $\bf A,B$ matrices become:
$${\bf A}=\left[\begin{array}{rrrr|rrrr}1&0&0&0&1&-1&1&-1\\0&1&0&0&0&1&-1&1\\ 0&0&1&0&0&0&1&-1\\0&0&0&1&0&0&0&1\\\hline 0&0&0&0&1&0&0&0\\0&0&0&0&0&1&0&0\\0&0&0&0&0&0&1&0\\0&0&0&0&0&0&0&1\end{array}\right], {\bf B} = \left[\begin{array}{rrrr|rrrr}1&0&0&0&1&0&0&0\\0&1&0&0&0&1&0&0\\0&0&1&0&0&0&1&0\\0&0&0&1&0&0&0&1\\\hline 1&1&0&0&0&0&0&0\\0&1&1&0&0&0&0&0\\0&0&1&1&0&0&0&0\\0&0&0&1&0&0&0&0\end{array}\right]$$
We can verify that this representation handles the products $\bf AB$ and $\bf BA$:
$${\bf AB} = \left[\begin{array}{rrrr|rrrr}2&0&0&0&1&0&0&0\\0&2&0&0&0&1&0&0\\0&0&2&0&0&0&1&0\\0&0&0&2&0&0&0&1\\\hline 1&1&0&0&0&0&0&0\\0&1&1&0&0&0&0&0\\0&0&1&1&0&0&0&0\\0&0&0&1&0&0&0&0\end{array}\right] \text{ which is indeed } \left[\begin{array}{cc}2&1\\p(x)&0\end{array}\right] \text{ , and }$$ $${\bf BA} = \left[\begin{array}{rrrr|rrrr}1&0&0&0&2&-1&1&-1\\0&1&0&0&0&2&-1&1\\0&0&1&0&0&0&2&-1\\0&0&0&1&0&0&0&2\\\hline 1&1&0&0&1&0&0&0\\0&1&1&0&0&1&0&0\\0&0&1&1&0&0&1&0\\0&0&0&1&0&0&0&1\end{array}\right] \text{which is} \left[\begin{array}{cc}1&1+1/p(x)\\p(x)&1\end{array}\right]$$
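These two block products can also be checked numerically; a NumPy sketch built from the $4\times 4$ blocks above:

```python
import numpy as np

n = 4
I = np.eye(n)
Z = np.zeros((n, n))
P = np.eye(n) + np.eye(n, k=1)   # Toeplitz block for p(x) = 1 + x
Pinv = np.linalg.inv(P)          # alternating-sign block for 1/p(x)

A = np.block([[I, Pinv], [Z, I]])
B = np.block([[I, I], [P, Z]])

# AB = [[2I, I], [P, 0]]  and  BA = [[I, I + P^{-1}], [P, I]]
assert np.allclose(A @ B, np.block([[2 * I, I], [P, Z]]))
assert np.allclose(B @ A, np.block([[I, I + Pinv], [P, I]]))
```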
But this still does not prove that this works in the general case, or show how to identify a rational function from its representation once we have done a series of operations on it.
EDIT
The above observation about geometric series allows us to write the inverse of $\bf P$ as a polynomial in a matrix.
$${\bf P}^{-1} = \sum_{i=0}^{\infty} ({\bf I}-{\bf P})^i$$
(The series terminates, since ${\bf I}-{\bf P}$ is strictly upper triangular, hence nilpotent, in our representation.)
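This Neumann-series identity can be verified directly; a NumPy sketch using the $1+x/2$ matrix from above, where only the first $n$ terms of the series are nonzero:

```python
import numpy as np

n = 6
P = np.eye(n) + 0.5 * np.eye(n, k=1)   # represents 1 + x/2

# P^{-1} = sum_{i=0}^{inf} (I - P)^i; here I - P is nilpotent,
# so the sum over i = 0 .. n-1 is already exact.
S = sum(np.linalg.matrix_power(np.eye(n) - P, i) for i in range(n))
assert np.allclose(S, np.linalg.inv(P))
```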
So, assuming our known matrix $\bf R$ representing some unidentified rational function can be written as ${\bf R}={\bf P}^{-1}{\bf Q}$, we can try to find $\bf P$ and $\bf Q$ by solving for them as such a matrix polynomial times a matrix. Any ideas on how to do that?