OK, I am aware this might seem a bit of an obscure question.
Lately I have grown interested in a family of matrices
$$M_{ij} = f(i,j)$$
Examples are the well-known convolution matrices, constant along diagonals: $$M_{ij} = c_{i-j}$$
which implement convolution with a filter whose $\mathcal Z$-transform is $$c(z) = \sum_{k=-N}^N c_kz^{k}$$
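For concreteness, here is a small numpy sketch of such a matrix for an arbitrary made-up filter supported on $\{-1,0,1\}$ (the filter values are my own choice); applying it to a vector agrees with zero-padded convolution:

```python
import numpy as np

# Made-up example filter c_k for k in {-1, 0, 1}; all other c_k are zero
c = {-1: 0.5, 0: 1.0, 1: -0.5}

N = 6
# Convolution (Toeplitz) matrix: M[i, j] = c_{i-j}
M = np.array([[c.get(i - j, 0.0) for j in range(N)] for i in range(N)])

# Multiplying by M performs convolution with the filter (zero-padded at the ends)
x = np.arange(N, dtype=float)
print(M @ x)
print(np.convolve(x, [0.5, 1.0, -0.5], mode='same'))  # same result
```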
Another example of such a matrix is the differentiation operator on a polynomial space $P_N$:
$$M_{ij} = i\cdot \delta(j-i-1)$$
For the second derivative:
$$(M^2)_{ij} = i\cdot(i+1)\cdot \delta(j-i-2)$$
Obviously, we also have, for linear combinations of higher-order derivatives:
$$\sum_{n} d_n(M^n)_{ij} = \sum_{n} P_n(i)\,\delta(j-i-n)$$
where each $P_n$ is a polynomial of degree $n$.
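A quick numpy sketch (the size $N=6$ and the 0-based indexing are my own choices) confirms these formulas on a small polynomial space:

```python
import numpy as np

N = 6  # coefficient vectors over the basis 1, x, ..., x^5

# Differentiation matrix: d/dx x^k = k x^(k-1), so (0-indexed) M[k-1, k] = k
M = np.diag(np.arange(1, N), k=1)

# Sanity check: the derivative of x^3 is 3x^2
e = np.zeros(N); e[3] = 1.0
print(M @ e)              # 3 in the x^2 slot, zeros elsewhere

# Second derivative: entries sit two places above the diagonal
M2 = M @ M
print(np.diag(M2, k=2))   # [2 6 12 20]

# A linear combination of powers of M, e.g. D^2 - I
A = M2 - np.eye(N)
```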
It is easy to imagine that we can save space by storing functions that compute these matrix elements on demand, instead of storing the whole matrices.
As a small example, for $D^2-I$, instead of storing the matrix:
$$\begin{bmatrix} -1 &0& 2& 0& 0& 0\\ 0& -1& 0& 6& 0& 0\\ 0& 0& -1& 0& 12& 0\\ 0& 0& 0& -1& 0& 20\\ 0& 0& 0& 0& -1& 0\\ 0& 0& 0& 0& 0& -1\end{bmatrix}$$
(or an even bigger one; it could be arbitrarily big!), it would be nicer to store only the three polynomials $-1,\,0,\,x(x+1)$. For example, if we assume they live in the space $P_2$, we could store
$$\begin{bmatrix}-1&0&0\\0&0&0\\0&1&1\end{bmatrix}$$ and then multiply with $$ \begin{bmatrix}1\\x\\x^2\end{bmatrix}$$
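As a sketch of the idea (the function names and layout are my own, and I use 1-based row indices to match the matrix above), the entries can be generated on demand from just the stored coefficient table:

```python
import numpy as np

# Row n of C holds the coefficients (in powers 1, x, x^2) of the polynomial
# P_n giving the n-th superdiagonal: M[i, i+n] = P_n(i), rows indexed from 1.
# For D^2 - I: P_0 = -1, P_1 = 0, P_2 = x + x^2.
C = np.array([[-1., 0., 0.],
              [ 0., 0., 0.],
              [ 0., 1., 1.]])

def entry(i, j):
    """M[i, j] computed on demand (i, j are 1-based)."""
    n = j - i                          # diagonal offset
    if not 0 <= n < C.shape[0]:
        return 0.0
    return np.polyval(C[n][::-1], i)   # polyval wants high-to-low coefficients

# Reconstruct the 6x6 matrix shown above from the 3x3 coefficient table
M = np.array([[entry(i, j) for j in range(1, 7)] for i in range(1, 7)])
print(M)
```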
We can see that the convolution matrices described above are a subset; in this description their coefficient matrices have the form
$$\begin{bmatrix}c_{-N}&0&0\\\vdots&0&0\\c_N&0&0\end{bmatrix}$$
with only the constant terms non-zero.
Now to my question: do matrices classified in this way have a name?
I am aware of, for example, Vandermonde matrices, which are not exactly the same but somewhat similar, as they also relate to polynomials.