Permutation-equivariant matrices


I'm trying to find out something about the entries of permutation-equivariant matrices, i.e. matrices where, if you swap entries of the input vector, the corresponding entries of the output vector swap as well.

e.g.

$$ \begin{pmatrix} 3 & 1\\ 1 & 3 \end{pmatrix} \cdot \begin{pmatrix} 2\\ 1 \end{pmatrix} = \begin{pmatrix} 7\\ 5 \end{pmatrix} $$ and $$ \begin{pmatrix} 3 & 1\\ 1 & 3 \end{pmatrix} \cdot \begin{pmatrix} 1\\ 2 \end{pmatrix} = \begin{pmatrix} 5\\ 7 \end{pmatrix} $$
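This example is easy to check numerically; a minimal sketch with NumPy (the variable names are my own):

```python
import numpy as np

# The example matrix from the question and the 2x2 swap (transposition) matrix.
A = np.array([[3.0, 1.0], [1.0, 3.0]])
P = np.array([[0.0, 1.0], [1.0, 0.0]])

x = np.array([2.0, 1.0])
print(A @ x)        # [7. 5.]
print(A @ (P @ x))  # [5. 7.] -- swapping the input swaps the output
```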

So what I tried was to define it in the following way $$ A \cdot \begin{pmatrix} 0 & 1\\ 1 & 0 \end{pmatrix} \cdot \vec{x} = \begin{pmatrix} 0 & 1\\ 1 & 0 \end{pmatrix} \cdot A \cdot \vec{x} $$ for two dimensions. For three dimensions I think I would need to specify three equations: $$ A \cdot \begin{pmatrix} 0 & 1 & 0\\ 1 & 0 & 0\\ 0 & 0 & 1 \end{pmatrix} \cdot \vec{x} = \begin{pmatrix} 0 & 1 & 0\\ 1 & 0 & 0\\ 0 & 0 & 1 \end{pmatrix} \cdot A \cdot \vec{x}\\ A \cdot \begin{pmatrix} 0 & 0 & 1\\ 0 & 1 & 0\\ 1 & 0 & 0 \end{pmatrix} \cdot \vec{x} = \begin{pmatrix} 0 & 0 & 1\\ 0 & 1 & 0\\ 1 & 0 & 0 \end{pmatrix} \cdot A \cdot \vec{x}\\ A \cdot \begin{pmatrix} 1 & 0 & 0\\ 0 & 0 & 1\\ 0 & 1 & 0 \end{pmatrix} \cdot \vec{x} = \begin{pmatrix} 1 & 0 & 0\\ 0 & 0 & 1\\ 0 & 1 & 0 \end{pmatrix} \cdot A \cdot \vec{x} $$ I tried to use this definition to find out something about the coefficients $a_{i,j}$ of $A$ and to see whether they need to fulfil some sort of equation or condition. To me it seems like the matrix needs to be symmetric for the permutation equivariance to work, but I wasn't able to derive that from these defining equations.
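A quick numerical experiment (my own sketch, not part of the question) suggests a sufficient condition: any matrix with constant diagonal $a$ and constant off-diagonal $b$ commutes with every permutation matrix.

```python
import itertools
import numpy as np

n = 3
I = np.eye(n)
a, b = 3.0, 1.0

# Every diagonal entry equals a, every off-diagonal entry equals b,
# i.e. A = (a - b) * I + b * J with J the all-ones matrix.
A = (a - b) * I + b * np.ones((n, n))

for p in itertools.permutations(range(n)):
    P = I[list(p)]  # permutation matrix built by reordering identity rows
    assert np.allclose(A @ P, P @ A)  # so A(Px) = P(Ax) for every x
print("A commutes with all 3! = 6 permutation matrices")
```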

Hopefully someone can give me some ideas.


You want to compute the set of matrices that commute with every permutation matrix. Equivalently, you want to compute the endomorphism algebra of $k^n$ regarded as a representation of the symmetric group $S_n$.

This can be done as follows. Over every field of characteristic zero, $k^n$ breaks up as an $S_n$-representation as a direct sum of the trivial representation, spanned by $(1, 1, \dots 1)$, together with an irreducible representation of dimension $n-1$ given by the "orthogonal complement," namely the subspace

$$\{ (x_1, \dots x_n) \in k^n : \sum_{i=1}^n x_i = 0 \}.$$

Write $V$ for this irreducible representation. This gives $\text{End}(k^n) \cong \text{End}(1) \times \text{End}(V)$. We have $\text{End}(1) \cong k$ but $\text{End}(V)$ is only a priori a division algebra (by Schur's lemma). However, $V$ is absolutely irreducible, meaning it is irreducible after passing to the algebraic closure, and this implies (and is equivalent to) $\text{End}(V) \cong k$.
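The conclusion can be confirmed numerically: the condition $AP = PA$ is linear in the $n^2$ entries of $A$, so one can stack these constraints and compute the dimension of the solution space. A sketch of my own check (not part of the answer):

```python
import itertools
import numpy as np

n = 4
I = np.eye(n)

# In column-major vectorized form, A P - P A = 0 reads
# (P^T kron I - I kron P) vec(A) = 0.  Stack this for every P in S_n.
rows = [np.kron(I[list(p)].T, I) - np.kron(I, I[list(p)])
        for p in itertools.permutations(range(n))]
M = np.vstack(rows)

dim_commutant = n * n - np.linalg.matrix_rank(M)
print(dim_commutant)  # 2, matching End(k^n) = k x k
```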

Altogether we get that $\text{End}(k^n) \cong k \times k$. Explicitly, there is a $2$-dimensional vector space of matrices commuting with every permutation matrix, and a basis is given by the identity together with the projection operator from $k^n$ to its trivial subrepresentation, namely the matrix

$$\left[ \begin{array}{cccc} \frac{1}{n} & \frac{1}{n} & \cdots & \frac{1}{n} \\ \frac{1}{n} & \frac{1}{n} & \cdots & \frac{1}{n} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{1}{n} & \frac{1}{n} & \cdots & \frac{1}{n} \end{array} \right].$$

This matrix can be thought of as obtained by averaging together every permutation matrix.
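That averaging description is also easy to verify directly for small $n$; a sketch:

```python
import itertools
import numpy as np
from math import factorial

n = 4
I = np.eye(n)

# Average all n! permutation matrices entrywise.  Each entry (i, j) is 1 in
# exactly (n-1)! of them, so the average has every entry equal to 1/n.
avg = sum(I[list(p)] for p in itertools.permutations(range(n))) / factorial(n)

assert np.allclose(avg, np.full((n, n), 1.0 / n))
print(avg[0, 0])  # 0.25 for n = 4
```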

An alternative, and perhaps more conceptual, argument involving Hecke operators can also be used here; it avoids Schur's lemma and any need to know anything about irreducibility.