Simultaneous reduction to pseudo-permutations


In my research (theoretical physics) I've come across a mathematical problem that has so far stumped me, my colleagues, and my ability to google things, so I'm interested in seeing if anyone here has an answer.


First, let me make a definition: an $m\times m$ matrix $M$ is an $n$-pseudopermutation for some positive integer $n$ if $M^n=I$ and each row and column of $M$ contains exactly one nonzero element, which is an $n$th root of unity. This generalises permutation matrices, where the roots of unity are limited to 1.
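For concreteness, the definition is easy to state in code. Here is a numerical check (a numpy sketch; the function name and the tolerance are my own choices, not anything standard):

```python
import numpy as np

def is_pseudopermutation(M, n, tol=1e-9):
    """Check the definition: M**n == I, exactly one nonzero entry in each
    row and each column, and every nonzero entry an n-th root of unity."""
    M = np.asarray(M, dtype=complex)
    m = len(M)
    if not np.allclose(np.linalg.matrix_power(M, n), np.eye(m), atol=tol):
        return False
    nz = np.abs(M) > tol
    if not ((nz.sum(axis=0) == 1).all() and (nz.sum(axis=1) == 1).all()):
        return False
    return np.allclose(M[nz] ** n, 1, atol=tol)
```

Note that both conditions matter: $\begin{bmatrix}0&i\\-i&0\end{bmatrix}$ squares to the identity, but its entries are fourth roots of unity, so it is a 4-pseudopermutation without being a 2-pseudopermutation.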

Now, consider a finite set of matrices $A_1,A_2,\ldots,A_k$ that satisfy $A_i^n=I$ (the positive integer $n$ is the same for all $i$). The matrices do not necessarily commute, but $(A_{i_1}A_{i_2}\cdots A_{i_\ell})^n=I$ holds for any sequence of indices $\{i_1,i_2,\ldots,i_\ell\}$ (this property is of course guaranteed if they do commute).
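Since this condition quantifies over words of all lengths, a finite search can falsify it but never fully verify it; still, a brute-force check over short words is a useful sanity test (a sketch; `max_len` is an arbitrary cutoff of my own):

```python
import itertools
from functools import reduce

import numpy as np

def words_satisfy_power_condition(mats, n, max_len=4, tol=1e-9):
    """Check (A_{i1} A_{i2} ... A_{il})**n == I for every word of length
    at most max_len.  The condition in the text covers words of *all*
    lengths, so this can only ever falsify it, not prove it."""
    m = mats[0].shape[0]
    I = np.eye(m)
    for l in range(1, max_len + 1):
        for word in itertools.product(mats, repeat=l):
            prod = reduce(np.matmul, word)
            if not np.allclose(np.linalg.matrix_power(prod, n), I, atol=tol):
                return False
    return True
```

For example, the swap $\begin{bmatrix}0&1\\1&0\end{bmatrix}$ and the reflection $\mathrm{diag}(1,-1)$ each square to the identity, but their product is a quarter turn, so the check fails for $n=2$: the word condition really is an extra hypothesis beyond $A_i^n=I$.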

My question is then this: does there always exist an invertible matrix $P$ such that $P^{-1}A_i P$ is an $n$-pseudopermutation for all $i$? If so, how does one go about finding $P$? (When the matrices commute, this reduces to simultaneous diagonalisation.)



I have successfully done this for some $9\times 9$ matrices that showed up in my research, but the process was quite ad hoc and didn't amount to a general method. Still, I'll write out an example in case it helps inspire some answers. I'll keep the physics to a minimum since I'm appealing to the higher court of pure mathematics, but if you're interested, the full context can be found in this article.

An $N$-particle scattering process can be described using $N$ momenta $p_i$. This isn't very convenient, though, since the momenta aren't Lorentz invariant and don't make conservation of momentum manifest. It is therefore convenient to define generalized Mandelstam invariants $s_{ijk\cdots} = (p_i+p_j+p_k+\ldots)^2$. These are Lorentz invariant by construction, and conservation of momentum (i.e. $\sum_i p_i = 0$) is replaced by linear relations among the invariants, so choosing a suitable basis for the space they span means you can stop worrying about such things altogether.
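In code, with an explicit Minkowski metric (I use the mostly-minus signature here; that is a convention choice, not something fixed by the text), the invariants are simply:

```python
import numpy as np

# Mostly-minus Minkowski metric (signature convention is an assumption here)
ETA = np.diag([1.0, -1.0, -1.0, -1.0])

def mandelstam(momenta, indices):
    """Generalized Mandelstam invariant s_{ij...} = (p_i + p_j + ...)^2,
    with particle indices counted from 1 as in the text."""
    p = sum(momenta[i - 1] for i in indices)
    return p @ ETA @ p
```

For instance, two back-to-back massless momenta $(1,0,0,\pm1)$ give $s_{12} = 4$.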

For $N=6$, a convenient basis is $\{s_{12},s_{23},s_{34},s_{45},s_{56},s_{61},s_{123},s_{234},s_{345}\}$. It has the added benefit of being closed under cyclic permutations of the set $\{p_1,p_2,\ldots, p_6\}$, i.e. those cyclings act as permutations on this set of invariants. This makes writing down and manipulating any Lorentz-invariant function of the momenta that has this cyclic symmetry very convenient.

However, I also have to work with functions that have other symmetries. One is symmetric under cyclic permutations of $\{p_1,p_2,p_3\}$ and $\{p_4,p_5,p_6\}$, as well as under swapping these two subsets. This is a problem, because the basis is not closed --- for instance, the cycling $\{p_1,p_2,p_3\}\mapsto\{p_2,p_3,p_1\}$ maps $s_{34}\rightarrow s_{14}$ which is not an element of the basis (instead, it is the linear combination $s_{56}-s_{123}-s_{234}+s_{23}$). The expressions become messy and the symmetry becomes hidden.
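Such rewritings can be checked without ever constructing momenta. Assuming massless momenta ($p_i^2=0$, which the quoted expression for $s_{14}$ presupposes), conservation gives $0 = 2p_i\cdot\sum_j p_j = \sum_{j\neq i} s_{ij}$, and any point in the kernel of those six linear relations represents a kinematic configuration at the linear level (a numpy sketch; the projection trick and the names are mine):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
pairs = list(itertools.combinations(range(1, 7), 2))   # the 15 invariants s_ij
idx = {p: k for k, p in enumerate(pairs)}

def sij(v, i, j):
    return v[idx[tuple(sorted((i, j)))]]

def sijk(v, i, j, k):
    # s_ijk = s_ij + s_ik + s_jk for massless momenta
    return sij(v, i, j) + sij(v, i, k) + sij(v, j, k)

# Six relations sum_{j != i} s_ij = 0 from masslessness + momentum conservation
C = np.zeros((6, 15))
for i, j in pairs:
    C[i - 1, idx[(i, j)]] = C[j - 1, idx[(i, j)]] = 1

# Random point in ker C: a generic configuration at the level of the relations
v = rng.normal(size=15)
v -= C.T @ np.linalg.lstsq(C.T, v, rcond=None)[0]   # project onto ker C

lhs = sij(v, 1, 4)
rhs = sij(v, 5, 6) - sijk(v, 1, 2, 3) - sijk(v, 2, 3, 4) + sij(v, 2, 3)
```

On every such point, `lhs` and `rhs` agree, confirming the linear combination quoted above for $s_{14}$.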

As a remedy, I seek a new basis that is closed under the different symmetry operations. In this case, the symmetry group is generated by $\{p_1,p_2,p_3\}\mapsto\{p_2,p_3,p_1\}$ and $\{p_1,p_2,p_3,p_4,p_5,p_6\}\mapsto\{p_4,p_5,p_6,p_1,p_2,p_3\}$. On the space of invariants with the old basis, these act as the linear transformations (zero elements omitted) \begin{equation} A_1 = \begin{bmatrix} &1&&&&&&&\\ -1&-1&&&&&1&&\\ &1&&&1&&-1&-1&\\ &&&1&&&&&\\ &&&&1&&&&\\ -1&&&&&-1&&&1\\ &&&&&&1&&\\ -1&&1&&1&&&-1&\\ &1&&1&&-1&-1&&\\ \end{bmatrix}\quad\text{and}\quad A_2= \begin{bmatrix} &&&1&&&&&\\ &&&&1&&&&\\ &&&&&1&&&\\ 1&&&&&&&&\\ &1&&&&&&&\\ &&1&&&&&&\\ &&&&&&1&&\\ &&&&&&&1&\\ &&&&&&&&1\\ \end{bmatrix}, \end{equation} respectively (rows give the images of the basis invariants). This is an example of the set of matrices described above (with $n=6$), and only the latter is a (pseudo)permutation. To find a better basis, I drew a Cayley graph mapping out how these generators transform all invariants of the type $s_{ij}$ or $s_{ijk}$ ($ij(k)$ indicates $s_{ij(k)}$; dashed lines indicate $A_2$ and solid triangles, read counterclockwise, indicate $A_1$).
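As a consistency check, both generator matrices can be rebuilt from first principles: represent each $s_{ij}$ as a coordinate vector, expand $s_{ijk}=s_{ij}+s_{ik}+s_{jk}$ (massless momenta), impose the six relations $\sum_{j\neq i}s_{ij}=0$ from momentum conservation, and express the image of each basis invariant under the relabeling back in the basis modulo those relations (a numpy sketch, with rows as images; the helper names are mine):

```python
import itertools
import numpy as np

pairs = list(itertools.combinations(range(1, 7), 2))   # the 15 invariants s_ij
idx = {p: k for k, p in enumerate(pairs)}

def vec(*indices):
    """s_{i...} as a coordinate vector in the 15-dimensional span of the
    s_ij, using s_ijk = s_ij + s_ik + s_jk (valid for massless momenta)."""
    v = np.zeros(15)
    for pair in itertools.combinations(sorted(indices), 2):
        v[idx[pair]] += 1
    return v

# Massless momentum conservation: 0 = 2 p_i . sum_j p_j = sum_{j != i} s_ij
C = np.zeros((6, 15))
for i, j in pairs:
    C[i - 1, idx[(i, j)]] = C[j - 1, idx[(i, j)]] = 1

basis = [vec(1, 2), vec(2, 3), vec(3, 4), vec(4, 5), vec(5, 6), vec(1, 6),
         vec(1, 2, 3), vec(2, 3, 4), vec(3, 4, 5)]

# Augmented system: express any 15-vector in the basis modulo the relations
A = np.hstack([np.column_stack(basis), C.T])   # 15 x 15, invertible

def matrix_of(sigma):
    """Matrix whose k-th row is the image of the k-th basis invariant
    under the momentum relabeling sigma."""
    rows = []
    for v in basis:
        w = np.zeros(15)
        for (i, j), k in idx.items():
            w[idx[tuple(sorted((sigma[i], sigma[j])))]] += v[k]
        rows.append(np.linalg.solve(A, w)[:9])
    return np.array(rows)

A1 = matrix_of({1: 2, 2: 3, 3: 1, 4: 4, 5: 5, 6: 6})   # cycle p1 -> p2 -> p3
A2 = matrix_of({1: 4, 2: 5, 3: 6, 4: 1, 5: 2, 6: 3})   # swap the two triples
```

Running this reproduces the permutation $A_2$ exactly, gives an $A_1$ whose rows encode the images $s_{12}\mapsto s_{23}$, $s_{23}\mapsto s_{123}-s_{12}-s_{23}$, and so on, and confirms $A_1^3 = A_2^2 = (A_1A_2)^6 = I$.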

Then I used the graph to find sets of invariants that transformed into each other under the group (the ones I ended up with are marked with $\star,\circ$ and $\bullet$, respectively). Lastly, I played around with linear combinations taken from these sets until I found one that gave a complete basis. This last step was pure trial and error, and it is the main reason I'm looking for a better way. The result was \begin{equation} \begin{gathered} t_1 = -\frac{s_{36} + s_{14} + s_{25}}{3},\quad t_2 = -\frac{s_{24} + s_{35} + s_{16}}{3},\quad t_3 = -\frac{s_{15} + s_{26} + s_{34}}{3},\\ t_6 = \frac{s_{36} + \omega s_{14} + \omega^2 s_{25}}{3},\quad t_4 = \frac{s_{24} + \omega s_{35} + \omega^2 s_{16}}{3},\quad t_5 = \frac{s_{15} + \omega s_{26} + \omega^2 s_{34}}{3},\\ t_9 = \frac{\omega^2 s_{36} + \omega s_{14} + s_{25}}{3},\quad t_7 = \frac{\omega^2 s_{24} + \omega s_{35} + s_{16}}{3},\quad t_8 = \frac{\omega^2 s_{15} + \omega s_{26} + s_{34}}{3}, \end{gathered} \end{equation} where $\omega = e^{2\pi i/3}$. Defining $P$ as the transformation that takes $\{s_{12},\ldots\}\mapsto\{t_1,\ldots\}$, both $A_1$ and $A_2$ become pseudopermutations ($A_1$ is a 3-pseudopermutation, and both are 6-pseudopermutations in the sense above): \begin{equation} P^{-1} A_1P = \begin{bmatrix} &1&&&&&&&\\ &&1&&&&&&\\ 1&&&&&&&&\\ &&&&\omega&&&&\\ &&&&&\omega&&&\\ &&&\omega&&&&&\\ &&&&&&&\omega^2&\\ &&&&&&&&\omega^2\\ &&&&&&\omega^2&&\\ \end{bmatrix},\qquad P^{-1} A_2P = \begin{bmatrix} 1&&&&&&&&\\ &&1&&&&&&\\ &1&&&&&&&\\ &&&1&&&&&\\ &&&&&1&&&\\ &&&&1&&&&\\ &&&&&&1&&\\ &&&&&&&&1\\ &&&&&&&1&\\ \end{bmatrix}. \end{equation} I have done the same for the cases where the symmetry consists of cyclings of $\{p_1,p_2\}$ and $\{p_3,\ldots,p_6\}$, and of cyclings/swaps of $\{p_1,p_2\},\{p_3,p_4\},\{p_5,p_6\}$, which was sufficient for everything I needed to accomplish. Still, it bugs me that I couldn't do it more elegantly, and with $N=8$ one would be dealing with $20\times 20$ matrices, so the same hand-fiddly approach is not really practical there.
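For what it's worth, the final claim is easy to verify mechanically from the block structure: each $3\times3$ block of $P^{-1}A_1P$ is a 3-cycle scaled by a cube root of unity, so its cube is the identity (a short numpy check; the block ordering follows the matrix displayed above):

```python
import numpy as np

w = np.exp(2j * np.pi / 3)   # the cube root of unity from the text
cyc = np.array([[0., 1, 0],  # 3-cycle permutation block
                [0, 0, 1],
                [1, 0, 0]])
Z = np.zeros((3, 3))

# Block-diagonal form of P^{-1} A_1 P quoted above
M = np.block([[cyc, Z, Z],
              [Z, w * cyc, Z],
              [Z, Z, w**2 * cyc]])

# Each block cubes to (root of unity)^3 * cyc^3 = I, so M**3 = I,
# and M is monomial with cube-root-of-unity entries.
print(np.allclose(np.linalg.matrix_power(M, 3), np.eye(9)))   # prints True
```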