Matrix power eventually reaches partial order


Given an $n \times n$ matrix $A$ of non-negative integers, a column vector $x$ of non-negative integers of length $n$, and a partial order on the set $\{1, 2, 3, \ldots, n\}$ (representing the indices of $x$), can we determine whether there exists some non-negative integer $k$ such that the entries of $S(k) = A^k x$ obey the partial order (i.e., find a program that decides this for us)? I'm not particularly interested in the value of $k$, only in whether it exists.
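For concreteness, here is a sketch of the easy half of the problem: searching for a witness $k$ up to a cutoff, in exact integer arithmetic so no numerical error is possible. The function names and the cutoff parameter `kmax` are my own; this only semi-decides the problem, since a failed search proves nothing about larger $k$.

```python
def mat_vec(A, x):
    """Multiply an integer matrix by an integer vector, exactly."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def satisfies(order, s):
    """order is a list of pairs (i, j) meaning s[i] <= s[j] (0-based)."""
    return all(s[i] <= s[j] for i, j in order)

def search_k(A, x, order, kmax):
    """Return the smallest k <= kmax such that A^k x obeys the partial
    order, or None.  A None result does not rule out larger k."""
    s = list(x)
    for k in range(kmax + 1):
        if satisfies(order, s):
            return k
        s = mat_vec(A, s)
    return None
```

For example, with $A = \operatorname{diag}(2, 3)$, $x = (4, 1)$, and the single constraint $s_1 \le s_2$, the search finds the first $k$ with $4 \cdot 2^k \le 3^k$.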

Warning: all of my research, work, and thoughts are below; read at your own risk. I found this paper, which gives a closed-form solution for $S(k)$ when $A$ is diagonalizable: each entry is an expression in powers of the eigenvalues, such as $2^k + 4 \cdot 3^k$. If we compare any two entries of $S(k)$, some power eventually dominates the inequality, so there exists a $k_0$ such that the inequality is uniform on $[k_0, \infty)$, once that term dominates. We can find $k_0$ by dropping or enlarging terms while preserving the inequalities, reducing to an expression like $k_0 \le \frac{\log w - \log x}{\log y - \log z}$, where $w, x$ are the coefficients of the two sides and $y, z$ are the eigenvalues. We can then check all $k$ up to $k_0$ by brute force to find every interval on which the inequality holds, including the infinite interval if the inequality holds there. Doing this for every inequality in the partial order and intersecting the resulting sets of $k$ answers the question: if the intersection is empty, there is no such $k$; otherwise, every $k$ in the intersection is a solution.
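As a sketch of that comparison step, assume each side is handed to us explicitly as a list of (coefficient, base) pairs with distinct positive rational bases. The representation and the crude $k_0$ bound below are my own; note the floating-point logarithms, which are exactly the precision problem raised later, and would need verified arithmetic in a rigorous version.

```python
from fractions import Fraction
import math

def holds_set(lhs, rhs):
    """lhs, rhs: exponential sums as lists of (coeff, base) pairs with
    positive bases, representing sum_i c_i * b_i^k.  Return (ks, tail):
    lhs(k) >= rhs(k) holds exactly for the k in ks, plus every k beyond
    the checked range iff tail is True."""
    # Collect the difference d(k) = lhs(k) - rhs(k) as {base: coeff}.
    d = {}
    for sign, terms in ((1, lhs), (-1, rhs)):
        for c, b in terms:
            b = Fraction(b)
            d[b] = d.get(b, Fraction(0)) + sign * Fraction(c)
    d = {b: c for b, c in d.items() if c != 0}
    if not d:
        return set(), True  # both sides identical, so >= always holds
    bases = sorted(d, reverse=True)
    bd, cd = bases[0], d[bases[0]]
    if len(bases) == 1:
        k0 = 0
    else:
        # For k > log(S / |cd|) / log(bd / b2), the dominant term
        # outweighs S, the sum of the other |coefficients|.  The
        # floating-point logs here are only illustrative.
        S = sum(abs(d[b]) for b in bases[1:])
        k0 = max(0, math.ceil(math.log(float(S / abs(cd)))
                              / math.log(float(bd / bases[1]))) + 1)
    ks = {k for k in range(k0 + 1)
          if sum(c * b ** k for b, c in d.items()) >= 0}
    return ks, cd > 0
```

For instance, $2^k + 4 \cdot 3^k \ge 5 \cdot 2^k$ reduces to $4 \cdot 3^k \ge 4 \cdot 2^k$, so the returned tail is `True` and every checked $k$ is included.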

(I'm assuming positive eigenvalues. Negative ones can be dealt with trivially by splitting each inequality into even and odd cases of $k$, comparing all the even cases and then all the odd cases, and taking the union of the two results.)
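The even/odd split can be mechanized: substituting $k = 2m$ or $k = 2m + 1$ turns each term $c\,b^k$ into a term in $m$ whose base $b^2$ is non-negative. A small helper of my own devising, using the same (coefficient, base) representation as above:

```python
from fractions import Fraction

def split_parity(terms):
    """terms: list of (coeff, base) pairs for sum_i c_i * b_i^k, where
    bases may be negative.  Return (even, odd): the same sum rewritten
    in m with k = 2m and k = 2m + 1 respectively, since
    b^(2m) = (b^2)^m and b^(2m+1) = b * (b^2)^m."""
    even = [(Fraction(c), Fraction(b) ** 2) for c, b in terms]
    odd = [(Fraction(c) * Fraction(b), Fraction(b) ** 2) for c, b in terms]
    return even, odd
```

A solution $m$ found for the even sum then corresponds to $k = 2m$, and one for the odd sum to $k = 2m + 1$.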

There are a few problems with this:

  1. Non-diagonalizable matrices may not be amenable to this method.
  2. In theory, we can round the eigenvalues up or down in whichever direction preserves the inequalities, obtaining a larger $k_0$ in exchange for not having to compute the exact values. However, the closer two eigenvalues are, the less we can round them while keeping them distinct, and as the formula for $k_0$ shows, nearly equal eigenvalues can make $k_0$ very large. How can we compute the eigenvalues to exactly the precision we need, given that no fixed precision always suffices?

Is there any algorithm I'm missing that allows for this sort of calculation, without numerical errors producing incorrect answers to the decision problem? Is there a lower bound on the distance between distinct eigenvalues, in terms of the entries of the matrix?
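On the precision question, one possible escape hatch is to avoid floating point entirely and treat the eigenvalues as exact algebraic numbers. SymPy, for example, represents real roots of an integer polynomial as `Rational` or `CRootOf` objects, and comparisons between them are decided exactly by root isolation, so two distinct eigenvalues are never confused no matter how close they are. A sketch with a matrix of my own choosing:

```python
import sympy as sp

# An integer matrix; its characteristic polynomial has integer
# coefficients, so the eigenvalues are algebraic numbers.
A = sp.Matrix([[2, 1], [0, 3]])
t = sp.Symbol('t')
p = A.charpoly(t).as_expr()

# real_roots returns exact representations (Rational or CRootOf),
# sorted in increasing order; comparisons are symbolic, not numeric.
eigs = sp.real_roots(p)
assert eigs[0] < eigs[1]  # decided exactly, no precision parameter
```

Comparisons and arithmetic on such roots stay exact, so only the final logarithms in the $k_0$ bound would still need interval arithmetic with adaptively increased precision.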

Also, let me know whether I would be better off moving this to MathOverflow, or how I could ask this question better.