Aloha, my dudes and dudettes.
I became interested in a specific type of matrix in this previous question. For context, these matrices occur when expressing differential equations on the coefficients of truncated power series.
In short, they are linear combinations of diagonal matrices, where the entries of each diagonal are given by a polynomial in the row index, with a different polynomial for each diagonal:
$$M_{ij} = P_{i-j}(i)$$
Two questions have popped up whose answers could greatly improve the usefulness of an alternate representation of these matrices:
- Are they closed under matrix multiplication? It seems to me that they are, but I have not been able to prove it.
- The identity matrix is one of the simplest examples in the set. If we equip the set with matrix multiplication as an operation, what would be left to prove in order to show that it forms a group?
Are you imposing any degree restrictions on the polynomials $P_{i-j}(i)$? If not, then I can answer both your questions:
The reason for both answers is that your set of matrices is actually all of $M(n, \mathbb{R})$, the set of all $n \times n$ real matrices. In particular, closure under multiplication holds trivially, but the set is not a group under multiplication, since singular matrices (the zero matrix, for instance) have no inverse.
Consider any matrix $A$. Along each diagonal (that is, with $i - j$ fixed) you have at most $n$ points of the form $(i, a_{ij})$, and you want a polynomial $P_{i-j}$ that passes through them. Lagrange interpolation guarantees a unique such polynomial of degree at most $n-1$. Doing this for each diagonal shows that your matrix (and hence every matrix) is of this form.
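To make the interpolation argument concrete, here is a small Python sketch (the helper names `lagrange_eval` and `diagonal_points` are my own) that takes an arbitrary matrix, fits one Lagrange interpolant per diagonal, and rebuilds every entry exactly using rational arithmetic:

```python
from fractions import Fraction

def lagrange_eval(points, x):
    """Evaluate the Lagrange interpolant through `points` at x."""
    total = Fraction(0)
    for k, (xk, yk) in enumerate(points):
        term = Fraction(yk)
        for m, (xm, _) in enumerate(points):
            if m != k:
                term *= Fraction(x - xm, xk - xm)
        total += term
    return total

def diagonal_points(A):
    """Group the entries of A by diagonal d = i - j as points (i, A[i][j])."""
    diags = {}
    for i, row in enumerate(A):
        for j, a in enumerate(row):
            diags.setdefault(i - j, []).append((i, a))
    return diags

# Pick an arbitrary matrix and rebuild every entry from the
# per-diagonal interpolants, confirming M_ij = P_{i-j}(i).
A = [[3, 1, 4],
     [1, 5, 9],
     [2, 6, 5]]
diags = diagonal_points(A)
B = [[lagrange_eval(diags[i - j], i) for j in range(3)] for i in range(3)]
assert B == A  # exact: every matrix admits this representation
```

Since the reconstruction succeeds for any choice of `A`, the representation places no restriction at all on the matrix.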
Note that the coefficients of the polynomials will often be much larger in magnitude than the actual entries of your matrix. So there is no real hope of saving space by storing the polynomials instead of the matrix entries unless you know a lot more about other restrictions on the form of your matrix.
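The coefficient-growth point can be illustrated with a quick sketch (again my own helper names, using Newton's divided differences at the integer nodes $0, \dots, n-1$): a diagonal that is all zeros except a single $1$ already forces interpolant coefficients an order of magnitude larger than any entry.

```python
from fractions import Fraction

def monomial_coeffs(ys):
    """Monomial coefficients (constant term first) of the unique
    interpolant through (0, ys[0]), (1, ys[1]), ..., computed via
    Newton's divided differences at integer nodes 0..n-1."""
    n = len(ys)
    dd = [Fraction(y) for y in ys]
    newton = [dd[0]]
    for level in range(1, n):
        # node spacing is 1, so the divisor x_{k+level} - x_k == level
        dd = [(dd[k + 1] - dd[k]) / level for k in range(len(dd) - 1)]
        newton.append(dd[0])
    # Expand c0 + c1*x + c2*x*(x-1) + ... into monomial form.
    coeffs = [Fraction(0)] * n
    basis = [Fraction(1)] + [Fraction(0)] * (n - 1)  # the polynomial "1"
    for level, c in enumerate(newton):
        for k in range(n):
            coeffs[k] += c * basis[k]
        # multiply the basis polynomial by (x - level) for the next term
        basis = [-level * basis[k] + (basis[k - 1] if k else 0)
                 for k in range(n)]
    return coeffs

def evalpoly(coeffs, x):
    return sum(c * Fraction(x) ** k for k, c in enumerate(coeffs))

# A diagonal whose entries are all 0 except a single 1: the matrix
# entries are tiny, but the interpolant's coefficients are not.
ys = [0, 0, 0, 1, 0, 0, 0]
coeffs = monomial_coeffs(ys)
assert [evalpoly(coeffs, i) for i in range(7)] == ys
print(max(abs(c) for c in coeffs))  # far larger than any entry
```

So even for a $7 \times 7$ matrix with entries in $\{0, 1\}$, storing the polynomial coefficients is already a worse deal than storing the entries.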