Can we use tensors to write multi-variable linear recurrences?


We can rewrite the linear recurrence on a sequence $(x_n)_{n\in\Bbb N}$ $$x_n = a_1x_{n-1}+a_2x_{n-2}+\dots +a_{m-1}x_{n-m+1}+a_mx_{n-m}$$ as the matrix operation $$ \begin{bmatrix} a_1 & a_2 & \dots & a_{m-1} & a_m\\ 1 & 0 & \dots & 0 & 0\\ 0 & 1 & \dots & 0 & 0\\ \vdots & \vdots & \ddots & \vdots & \vdots\\ 0 & 0 & \dots & 1 & 0\\ \end{bmatrix} \begin{bmatrix} x_{n-1}\\ x_{n-2}\\ \vdots\\ x_{n-m+1}\\ x_{n-m} \end{bmatrix} = \begin{bmatrix} x_{n}\\ x_{n-1}\\ \vdots\\ x_{n-m+2}\\ x_{n-m+1} \end{bmatrix} $$

This is interesting because we can calculate the terms of our sequence by repeated matrix multiplications (that is, by exponentiating the companion matrix), which can be done in $O(\log n)$ matrix products by repeated squaring.
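For concreteness, the matrix-power approach can be sketched in Python as follows (a minimal sketch; the function names are my own, not a standard API):

```python
# Sketch: computing terms of a constant-coefficient linear recurrence
# via fast exponentiation of the companion matrix.

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(M, e):
    """Raise M to the power e by repeated squaring (O(log e) products)."""
    n = len(M)
    R = [[1 if i == j else 0 for j in range(n)] for i in range(n)]  # identity
    while e:
        if e & 1:
            R = mat_mul(R, M)
        M = mat_mul(M, M)
        e >>= 1
    return R

def kth_term(coeffs, initial, k):
    """k-th term of x_n = coeffs[0]*x_{n-1} + ... + coeffs[m-1]*x_{n-m},
    given initial = [x_0, ..., x_{m-1}]."""
    m = len(coeffs)
    if k < m:
        return initial[k]
    # Companion matrix: coefficients on top, subdiagonal of ones below.
    C = [coeffs] + [[1 if j == i else 0 for j in range(m)] for i in range(m - 1)]
    P = mat_pow(C, k - m + 1)
    state = initial[::-1]  # [x_{m-1}, ..., x_0]
    return sum(P[0][j] * state[j] for j in range(m))

# Fibonacci: x_n = x_{n-1} + x_{n-2}, x_0 = 0, x_1 = 1
print(kth_term([1, 1], [0, 1], 10))  # → 55
```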

However, if your linear recurrence has two indices, as for instance $$x_{m,n} = x_{m,n-1}+x_{m-1,n},$$ there doesn't seem to be a way to write it as a matrix application. This makes sense, since we have increased the dimension of our sequence: we can no longer write the first terms as a vector. It makes more sense to write them in a matrix, as

$$\begin{bmatrix} x_{m, n} & x_{m-1, n} & \dots & x_{m-c, n}\\ x_{m, n-1} & x_{m-1, n-1} & \dots & x_{m-c, n-1}\\ \vdots & \vdots & \ddots & \vdots\\ x_{m, n-r} & x_{m-1, n-r} & \dots & x_{m-c, n-r}\\ \end{bmatrix}$$

Applying another matrix to it is no good, since that could only combine terms of the same row (or, multiplying on the other side, the same column), and I wish to combine diagonal terms as well. Couldn't a higher-dimensional matrix do the job?
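As an aside, even without a matrix formulation the two-index recurrence can be evaluated by filling the table directly; a small Python sketch (with illustrative all-ones boundary data, which makes the entries binomial coefficients by Pascal's rule):

```python
from math import comb

# Direct dynamic-programming evaluation of x_{m,n} = x_{m,n-1} + x_{m-1,n}.
# With boundary data x_{m,0} = x_{0,n} = 1 this yields x_{m,n} = C(m+n, m).

def table(M, N):
    x = [[0] * (N + 1) for _ in range(M + 1)]
    for m in range(M + 1):
        x[m][0] = 1          # boundary data x_{m,0}
    for n in range(N + 1):
        x[0][n] = 1          # boundary data x_{0,n}
    for m in range(1, M + 1):
        for n in range(1, N + 1):
            x[m][n] = x[m][n - 1] + x[m - 1][n]
    return x

x = table(6, 6)
print(x[3][3])               # → 20 == comb(6, 3)
```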

I know that a generalization of matrices to higher dimensions exists, called tensors. I also know we are supposed to be able to multiply them (and hopefully exponentiate them). However, whenever I have come into contact with this topic, the product is presented in a completely abstract way, so I don't really know how it would work with actual numbers.

In my research I've encountered this post, but even there the product is interpreted abstractly, or a definition for it is ''invented'' that doesn't seem to have any nice properties. The lack of a practical answer makes me wonder whether this product even exists beyond its formal definition.


Expanding on JeanMarie's excellent suggestion, consider how easily generating functions can solve your example:

$$ (1)\qquad a_{m,n}= a_{m-1,n}+ a_{m,n-1}$$ valid for $m,n \geq 1$

Let $A(x,y)= \sum_{m,n} a_{m,n} x^m y^n$ be its generating function. Define $ f(x)= \sum_m a_{m,0} x^m$ and $ g(y)=\sum_{n} a_{0,n} y^n .$

(These two sums share a common initial term $c=a_{0,0}$ that needs special attention below because of overcounting.)

After multiplying (1) by $x^{m} y^{n}$ and summing over $m,n\ge 1$ you obtain $A(x,y) - f(x) - g(y) + c = x\,\bigl(A(x,y)-f(x)\bigr)+ y\,\bigl(A(x,y)-g(y)\bigr)$, hence $ A(x,y)( 1- x -y)= (1-x)f(x)+ (1-y)g(y) -c$.

Thus you can solve for $A(x,y)= \dfrac{ (1-x)f(x)+ (1-y)g(y) -c}{1- x-y}$.
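A quick numerical sanity check of the identity $A(x,y)(1-x-y)=(1-x)f(x)+(1-y)g(y)-c$ (a Python sketch; the boundary data below is arbitrarily chosen for illustration):

```python
# Check A(x,y)(1 - x - y) = (1-x)f(x) + (1-y)g(y) - c coefficient by
# coefficient, for some illustrative boundary data on the two axes.
N = 8
f = [m + 1 for m in range(N + 1)]    # a_{m,0} = m + 1
g = [2 ** n for n in range(N + 1)]   # a_{0,n} = 2^n (note a_{0,0} = 1 = c in both)
c = 1

# Fill a_{m,n} by the recurrence a_{m,n} = a_{m-1,n} + a_{m,n-1}.
a = [[0] * (N + 1) for _ in range(N + 1)]
for m in range(N + 1):
    a[m][0] = f[m]
for n in range(N + 1):
    a[0][n] = g[n]
for m in range(1, N + 1):
    for n in range(1, N + 1):
        a[m][n] = a[m - 1][n] + a[m][n - 1]

def lhs(m, n):
    """Coefficient of x^m y^n in A(x,y)(1 - x - y)."""
    return a[m][n] - (a[m - 1][n] if m else 0) - (a[m][n - 1] if n else 0)

def rhs(m, n):
    """Coefficient of x^m y^n in (1-x)f(x) + (1-y)g(y) - c."""
    r = 0
    if n == 0:
        r += f[m] - (f[m - 1] if m else 0)
    if m == 0:
        r += g[n] - (g[n - 1] if n else 0)
    if m == 0 and n == 0:
        r -= c
    return r

assert all(lhs(m, n) == rhs(m, n) for m in range(N + 1) for n in range(N + 1))
print("closed form verified up to degree", N)
```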

Note that $f$ and $g$ encode initial data for the recurrence equation, located on the two positive axes.

The values of the coefficients of $A(x,y)$ can be found in many ways, one of which makes use of the geometric series expansion $\frac{1}{1-(x+y)} = 1 + (x+y) + (x+y)^2+ \ldots$. In many cases such expansions can be coded with numerical efficiency by exploiting the connection between convolutions and Fourier series.
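The convolution connection mentioned above can be illustrated with a small NumPy sketch: a polynomial product is a convolution of coefficient arrays, which the FFT computes in $O(n\log n)$ (here we square $1+x+\dots+x^7$ both ways):

```python
import numpy as np

# Polynomial products are coefficient convolutions; the FFT computes them
# fast. Square p(x) = 1 + x + ... + x^7 two ways and compare.
p = np.ones(8)

# Direct convolution (schoolbook O(n^2) product).
direct = np.convolve(p, p)

# FFT-based product: zero-pad, transform, multiply pointwise, invert.
size = 2 * len(p) - 1
fft_prod = np.fft.irfft(np.fft.rfft(p, size) ** 2, size)

assert np.allclose(direct, fft_prod)
print(direct.astype(int))  # → [1 2 3 4 5 6 7 8 7 6 5 4 3 2 1]
```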

The upshot is that multivariate polynomials are efficient tools for encoding/solving the recurrence relation.

Finally, to answer your question in the affirmative: there is a connection with tensors, in that these multivariate polynomials can be regarded as elements of a space of symmetric tensor products.
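To make that correspondence concrete, here is a small illustrative sketch (the encoding is an assumption for demonstration, not a standard API): a homogeneous polynomial of degree $d$ in $(x,y)$ can be stored as a symmetric order-$d$ array $T$, with $p(v)=\sum_{i_1,\dots,i_d} T[i_1,\dots,i_d]\,v_{i_1}\cdots v_{i_d}$.

```python
import numpy as np
from itertools import product, permutations

# Encode the degree-2 homogeneous polynomial (x + y)^2 = x^2 + 2xy + y^2
# as the symmetric 2x2 all-ones tensor.
T = np.ones((2, 2))

def evaluate(T, v):
    """Evaluate the polynomial encoded by tensor T at the point v."""
    total = 0.0
    for idx in product(range(2), repeat=T.ndim):
        term = T[idx]
        for i in idx:
            term *= v[i]
        total += term
    return total

# Symmetry check: T is invariant under any permutation of its indices.
assert all(np.allclose(T, np.transpose(T, s)) for s in permutations(range(T.ndim)))
print(evaluate(T, (3.0, 4.0)))  # → 49.0 == (3 + 4)^2
```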