Can you completely permute the elements of a matrix by applying permutation matrices?


Suppose I have an $n\times n$ matrix $A$. Can I, by using only pre- and post-multiplication by permutation matrices, permute all the elements of $A$? That is, there should be no binding conditions, like $a_{11}$ always staying to the left of $a_{n1}$, etc.

This seems intuitively obvious to me. My thinking is that I can write the matrix as an $n^2$-dimensional vector, permute all its entries by multiplying by a suitable $n^2\times n^2$ permutation matrix, and then re-form a matrix from the permuted vector.


Accepted answer:

It is not generally possible to do so.

For a concrete example, there can exist no permutation matrices $P,Q$ such that $$ P\pmatrix{1&2\\2&1}Q = \pmatrix{2&1\\2&1} $$ If such a $P$ and $Q$ existed, then both matrices would necessarily have the same rank, since multiplication by an invertible matrix preserves rank; but the left-hand matrix has rank $2$ while the right-hand matrix has rank $1$.
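The example above is small enough to check exhaustively. A minimal sketch in NumPy, brute-forcing over all $2\times 2$ permutation matrices (there are only two):

```python
import itertools

import numpy as np

A = np.array([[1, 2], [2, 1]])  # rank 2
B = np.array([[2, 1], [2, 1]])  # rank 1

# All 2x2 permutation matrices, built by permuting the rows of the identity.
perms = [np.eye(2, dtype=int)[list(p)] for p in itertools.permutations(range(2))]

# Look for any pair (P, Q) with P A Q == B.
hits = [(P, Q) for P in perms for Q in perms if np.array_equal(P @ A @ Q, B)]

print(len(hits))  # 0: no such P, Q exist
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))  # 2 1
```

The empty search confirms the rank argument: no pair of permutation matrices bridges the rank gap.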

Answer:

Some users of MSE are very sensitive to the word "obvious", but I believe it is blatantly obvious that the answer to your question is "no" in general. The reason is simple: by left (right) multiplying $A$ by a permutation matrix, you are permuting each row (column) of $A$ as a whole. Therefore, entries on the same row (column) of $A$ will still align in a row (column). You cannot break row or column alignment by applying left and/or right permutations to $A$.

From another perspective, if you vectorise $PAQ$, it becomes $\operatorname{vec}(PAQ)=(Q^T\otimes P)\operatorname{vec}(A)$. While the Kronecker product $Q^T\otimes P$ is an $n^2\times n^2$ permutation matrix, it is clear that not all $n^2\times n^2$ permutation matrices are decomposable tensors.
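The vectorisation identity is easy to verify numerically. A quick sketch (using NumPy's column-major `order="F"` flattening, which matches the usual $\operatorname{vec}$ convention):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.integers(0, 10, (n, n))

# Random n x n permutation matrices P and Q.
P = np.eye(n, dtype=int)[rng.permutation(n)]
Q = np.eye(n, dtype=int)[rng.permutation(n)]

# vec() stacks the columns of a matrix into one long vector.
vec = lambda M: M.flatten(order="F")

lhs = vec(P @ A @ Q)
rhs = np.kron(Q.T, P) @ vec(A)
print(np.array_equal(lhs, rhs))  # True
```

So $PAQ$ only ever realises the permutations of $\operatorname{vec}(A)$ whose matrices factor as a Kronecker product $Q^T\otimes P$, which is a tiny subset of all $(n^2)!$ permutation matrices.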

Answer:

Let me add one more argument:

For $n \ge 2$:

Suppose the entries in the $n \times n$ matrix $A$ are all distinct. Then there are $(n^2)!$ distinct permutations of $A$.

There are $n!$ row-permutations of $A$ (generated by premultiplication by various permutation matrices), and $n!$ col-permutations of $A$ (generated by post-multiplication by permutation matrices). If we consider all expressions of the form $$ RAC $$ where $R$ and $C$ each range independently over all $n!$ permutation matrices, we get at most $(n!)^2$ possible results. But for $n > 1$, we have \begin{align} (n!)^2 &= [ n \cdot (n-1) \cdots 2 \cdot 1 ] [ n \cdot (n-1) \cdots 2 \cdot 1 ] \\ &< [ 2n \cdot (2n-1) \cdots (n+2) \cdot (n+1) ] [ n \cdot (n-1) \cdots 2 \cdot 1 ] \\ &= (2n)! \\ &\le (n^2)! \end{align} because $2n \le n^2$ for $n \ge 2$, and factorial is an increasing function on the positive integers. So the number of possible results of applying row- and col-permutations to $A$ is smaller than the number of possible permutations of the elements of $A$. Hence there's some permutation of $A$ that does not appear in our list of all $RAC$ matrices.
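The counting gap is already visible at $n = 2$. A minimal sketch that enumerates every $RAC$ for a $2\times 2$ matrix with distinct entries and compares against $(n^2)!$:

```python
import itertools
import math

import numpy as np

n = 2
A = np.arange(1, n * n + 1).reshape(n, n)  # distinct entries 1..4

# All n x n permutation matrices, built by permuting the rows of the identity.
perms = [np.eye(n, dtype=int)[list(p)] for p in itertools.permutations(range(n))]

# Every distinct result R A C, stored as a flat tuple for deduplication.
reachable = {tuple((R @ A @ C).flatten()) for R in perms for C in perms}

print(len(reachable))          # 4: at most (2!)^2 arrangements are reachable
print(math.factorial(n * n))   # 24: total permutations of the four entries
```

Only 4 of the 24 arrangements of the entries can be reached, matching the $(n!)^2 < (n^2)!$ bound.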

BTW, just to close this out: for $1 \times 1$ matrices, the answer is "yes, all permutations can in fact be realized by row and column permutations." I suspect you knew that. :)

PS: Following the comment by @Jack M, I want to make clear why it's OK to consider only things of the form $RAC$. Why do we do the column permutations first, and then the rows? (Or vice versa, if you read things the other way.) What if interleaving row and column permutations does something funky? The answer is that a row operation commutes with a column operation, since $(RA)C = R(AC)$ by associativity of matrix multiplication. So if you do a bunch of row ops interleaved with a bunch of column ops, you get the same thing as if you do all the row ops first, and then all the column ops afterwards (although the row ops have to come in the same order in this rearrangement, and similarly for the column ops). That requires a little proof, but nothing hairy.

What if we do more than one row-permutation, say, two of them? Don't we have to look at $R_1 R_2 A C$ instead? Answer: $R_1 R_2$ will again be a permutation matrix, so we can really consider this as being $(R_1 R_2) A C$, i.e. the matrix I've called $R$ is the product of any number of permutation matrices. And it will always be a matrix with one $1$ in each row and each column. So my counting of possible row-permutations is still valid.

Answer:

Given two elements $a_1$ and $a_2$, the properties "$a_1$ and $a_2$ are on different rows" and "$a_1$ and $a_2$ are on different columns" are preserved by any row or column permutation. Proof:

A column permutation won't affect what row anything is on. A row permutation sends each row, as a whole, to some row, so two elements that start on the same row end on the same row. Permutations are invertible, so if they can't take two elements on the same row to different rows, they can't take elements on different rows to the same row.

An analogous argument holds for being on the same or different columns.

Thus, a row and column permutation is completely characterized by what it does to a diagonal; to find out where it sends an arbitrary element, just take the row that its row was sent to, and the column its column was sent to.
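This characterization can be seen concretely: a combined row-and-column permutation is fully determined by a pair of index permutations, one for rows and one for columns. A small sketch (the specific `row_perm` and `col_perm` below are arbitrary illustrative choices):

```python
import numpy as np

n = 3
A = np.arange(n * n).reshape(n, n)

row_perm = np.array([2, 0, 1])  # new row i comes from old row row_perm[i]
col_perm = np.array([1, 2, 0])  # new column j comes from old column col_perm[j]

P = np.eye(n, dtype=int)[row_perm]      # permutes rows on the left
Q = np.eye(n, dtype=int)[:, col_perm]   # permutes columns on the right

B = P @ A @ Q

# Entry (i, j) of B is exactly A[row_perm[i], col_perm[j]]:
# where an element lands is determined by where its row and column go.
print(np.array_equal(B, A[np.ix_(row_perm, col_perm)]))  # True
```

In particular, knowing where the diagonal entries $a_{ii}$ land pins down both index permutations, and hence the image of every other entry.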