I am interested in finding the null space of the fixed-point equation:
$$ Pw = w \\ (P_{out} \otimes P_{in}^T)w = w $$
Where $P \in \Bbb R^{nm \times nm}$ is the Kronecker product of two permutation/reflection matrices $P_{in} \in \Bbb R^{n\times n}$, $P_{out} \in \Bbb R^{m\times m}$ (each with a single $\pm 1$ per column). This equation is equivalent to $P_{out}W = WP_{in}$, where $w$ is the vectorization of the matrix $W \in \Bbb R^{m \times n}$.
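Vectorization conventions vary, so it is worth pinning one down: with NumPy's row-major `flatten`, the identity $\operatorname{vec}(AXB) = (A \otimes B^T)\operatorname{vec}(X)$ holds, so under this convention $(P_{out} \otimes P_{in}^T)w$ is $\operatorname{vec}(P_{out} W P_{in})$. A quick numerical check (the sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 4                               # arbitrary small sizes
W = rng.standard_normal((m, n))
Pout = np.eye(m)[rng.permutation(m)]      # random permutation matrices
Pin = np.eye(n)[rng.permutation(n)]

# row-major vec: vec(A X B) == (A ⊗ B^T) vec(X)
lhs = (Pout @ W @ Pin).flatten()
rhs = np.kron(Pout, Pin.T) @ W.flatten()
assert np.allclose(lhs, rhs)
```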
The problem with the introduction of the Kronecker product is that the dimensionality of $P$ quickly grows beyond what any eigendecomposition algorithm can tractably handle (I am interested in the nullspace $Q$ of the equation so that I can parameterize $W$ to comply with the fixed-point equation):
$$ (P - I) w = 0 \\ U \left[\begin{array}{ll} \Sigma & 0 \\ 0 & 0 \end{array}\right]\left[\begin{array}{l} B^{\top} \\ Q^{\top} \end{array}\right] w = 0 $$
Is it possible to compute the eigendecomposition of $(P - I)$ as a function of the eigendecompositions of the smaller, more tractable matrices $P_{in}$ and $P_{out}$?
More General Problem: Things get considerably more challenging when there is more than a single symmetry constraint: $(^iP,\,\dots,\,^lP)= ((^iP_{in},\, ^iP_{out}),\dots, (^lP_{in}, \,^lP_{out}))$, and the fixed-point equation becomes:
$$ \begin{bmatrix} \,^iP - I \\ \vdots \\ \,^lP - I \end{bmatrix} w = 0 \\ $$
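For sizes where it is still tractable, the nullspace $Q$ of the stacked system can be computed directly by SVD, which at least provides a reference to test any factorized approach against. A brute-force sketch (the function name and tolerance are my own):

```python
import numpy as np

def symmetry_nullspace(pairs, tol=1e-10):
    """Basis for the common nullspace of (P_out ⊗ P_in^T - I) over all
    (P_in, P_out) pairs, found by SVD of the stacked system."""
    m, n = pairs[0][1].shape[0], pairs[0][0].shape[0]
    I = np.eye(m * n)
    A = np.vstack([np.kron(Pout, Pin.T) - I for Pin, Pout in pairs])
    _, s, Vt = np.linalg.svd(A)
    rank = np.sum(s > tol)
    return Vt[rank:].T          # columns span {w : A w = 0}
```

For example, with a single pair of $2 \times 2$ swaps the fixed subspace is two-dimensional, and the returned columns satisfy $Pw = w$.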
$P_{out},P_{in}$ are orthogonal matrices and are therefore diagonalizable (over $\Bbb C$) with eigenvalues of magnitude $1$. It follows that the solution space (as a subspace of $\Bbb C^{nm}$) is spanned by the vectors of the form $x \otimes y$ where there exists a $\lambda \in \Bbb C$ (with $|\lambda| = 1$) such that $x \in \Bbb C^m$, $y \in \Bbb C^n$ satisfy $$ P_{out} x = \lambda x, \quad P_{in}^T y = \bar \lambda y. $$ Indeed, for such a pair, $P(x \otimes y) = (P_{out}x) \otimes (P_{in}^T y) = \lambda \bar\lambda\, (x \otimes y) = x \otimes y$. Because $P = P_{out} \otimes P_{in}^T$ is a real matrix, the solution space of $Pw = w$ in $\Bbb R^{nm}$ is spanned by the real and imaginary parts of the solutions over $\Bbb C^{nm}$.
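To illustrate this construction numerically (the specific matrices below are arbitrary examples: a $3$-cycle and a signed $4 \times 4$ permutation), pairing eigenvalues of $P_{out}$ with eigenvalues of $P_{in}^T$ whose product is $1$ recovers the full real solution space:

```python
import numpy as np

Pout = np.array([[0.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0],
                 [1.0, 0.0, 0.0]])           # a 3-cycle
Pin = np.array([[0.0, 1.0, 0.0, 0.0],
                [1.0, 0.0, 0.0, 0.0],
                [0.0, 0.0, -1.0, 0.0],
                [0.0, 0.0, 0.0, 1.0]])       # swap + sign flip + fixed point
m, n = 3, 4
P = np.kron(Pout, Pin.T)

lam, X = np.linalg.eig(Pout)    # P_out x = lam x
mu, Y = np.linalg.eig(Pin.T)    # P_in^T y = mu y

# P (x ⊗ y) = (lam mu)(x ⊗ y), so the fixed vectors come from pairs with lam mu = 1
sols = [np.kron(X[:, i], Y[:, j])
        for i in range(m) for j in range(n)
        if np.isclose(lam[i] * mu[j], 1)]
for w in sols:
    assert np.allclose(P @ w, w)

# real and imaginary parts of the complex solutions span the real nullspace
R = np.column_stack([v for w in sols for v in (w.real, w.imag)])
rank_R = np.linalg.matrix_rank(R)
null_dim = m * n - np.linalg.matrix_rank(P - np.eye(m * n))
assert rank_R == null_dim
```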
Suppose that $P$ is a generalized permutation matrix with $\pm 1$ non-zero entries. We can find the eigenvectors of $P$ associated with $1$ using the following steps.
In particular, if $$ P=\pmatrix{0&p_1&0&0&\cdots&0\\0&0&p_2&0&\cdots&0\\0&0&0&p_3&\cdots&0\\ 0&0&0&0&\cdots&0\\ \vdots&\vdots&\vdots&\vdots&\ddots&\vdots\\ p_n&0&0&0&\cdots&0}, $$ and $Pw = w$ for the column vector $w \in \Bbb R^n$, then we have $$ (p_1w_2,p_2w_3,\dots,p_{n-1}w_n,p_nw_1) = (w_1,w_2,\dots,w_n). $$

Without loss of generality, set $w_n = 1$. From the equation above, we have $$ w_i = p_i w_{i+1}, \quad i=1,\dots,n-1. $$

Finally, an eigenvector associated with $1$ exists for this matrix iff the final equation, $w_n = p_nw_1$, holds, which amounts to $1 = p_1p_2\cdots p_n$. If this holds, then the eigenvector $w$ is given by $$ w = (p_1\cdots p_{n-1},\,p_2 \cdots p_{n-1},\,\dots,\,p_{n-1},\,1). $$
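As a quick sanity check of this formula, here is a $4$-cycle with signs chosen so that $p_1p_2p_3p_4 = 1$ (the particular signs are an arbitrary example):

```python
import numpy as np

p = np.array([-1.0, 1.0, -1.0, 1.0])            # p1..p4, product = 1
n = len(p)
P = np.zeros((n, n))
P[np.arange(n - 1), np.arange(1, n)] = p[:-1]   # p_i at position (i, i+1)
P[n - 1, 0] = p[-1]                             # p_n in the bottom-left corner

# the claimed eigenvector w = (p1 p2 p3, p2 p3, p3, 1)
w = np.array([p[0] * p[1] * p[2], p[1] * p[2], p[2], 1.0])
assert np.allclose(P @ w, w)
```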
Here's a function that computes a basis for the solution space to $Pw = w$ given the $\pm 1$ permutation matrix $P$.
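A sketch of such a function in NumPy (the name `fixed_point_basis` is mine; it follows the cycle-based construction above):

```python
import numpy as np

def fixed_point_basis(P):
    """Basis for {w : P w = w}, where P is a permutation matrix with
    +/-1 nonzero entries (one per row and column)."""
    n = P.shape[0]
    pi = np.argmax(np.abs(P), axis=1)     # column of the nonzero entry in row i
    s = P[np.arange(n), pi]               # the +/-1 entry itself
    basis, remaining = [], set(range(n))
    while remaining:
        start = remaining.pop()           # trace one cycle of the permutation
        idx, val, i = [start], [1.0], start
        while pi[i] != start:
            # row i of P w = w reads w_i = s_i w_{pi(i)}, so w_{pi(i)} = s_i w_i
            val.append(s[i] * val[-1])
            i = pi[i]
            idx.append(i)
            remaining.remove(i)
        # the closing equation w_i = s_i w_start holds iff the cycle's signs
        # multiply to +1; otherwise the cycle contributes no eigenvector
        if s[i] * val[0] == val[-1]:
            w = np.zeros(n)
            w[idx] = val
            basis.append(w)
    return np.column_stack(basis) if basis else np.zeros((n, 0))

C = np.roll(np.eye(3), 1, axis=1)        # 3-cycle, all signs +1
print(fixed_point_basis(C).T)            # -> [[1. 1. 1.]]
S = np.array([[0.0, 1.0], [-1.0, 0.0]])  # 2-cycle with signs 1, -1
print(fixed_point_basis(S).shape)        # -> (2, 0)
```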
The columns of the returned matrix form a basis of the solution space.
The script can easily be adapted to handle generalized permutation matrices (i.e. matrices like these except that the non-zero entries can take any value).
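For instance, a self-contained sketch of that adaptation (the function name is mine): the back-substitution becomes $w_{\pi(i)} = w_i / s_i$, and a cycle contributes an eigenvector iff the product of its nonzero entries equals $1$:

```python
import numpy as np

def fixed_point_basis_general(P, tol=1e-12):
    """Basis for {w : P w = w} for a generalized permutation matrix P
    (one arbitrary nonzero entry per row and column)."""
    n = P.shape[0]
    pi = np.argmax(np.abs(P), axis=1)     # column of the nonzero entry in row i
    s = P[np.arange(n), pi]
    basis, remaining = [], set(range(n))
    while remaining:
        start = remaining.pop()
        idx, val, i = [start], [1.0], start
        while pi[i] != start:
            val.append(val[-1] / s[i])    # w_{pi(i)} = w_i / s_i
            i = pi[i]
            idx.append(i)
            remaining.remove(i)
        # consistent iff w_start == w_i / s_i, i.e. the cycle's product is 1
        if abs(val[-1] / s[i] - val[0]) < tol:
            w = np.zeros(n)
            w[idx] = val
            basis.append(w)
    return np.column_stack(basis) if basis else np.zeros((n, 0))
```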
If $n$ is very large, one may consider optimizing the computation of the accumulated product. The computation of the cyclic decomposition can also be made faster if the min-element extraction is replaced with the `set.pop()` method.