Conjugation under Permutation Matrices


This seems like a basic matrix algebra question, but I can't seem to find an easy reference or trick for it. If I am given two diagonal matrices $D_1$ and $D_2$ and guaranteed that they are related by conjugation by a permutation matrix, i.e. $$D_2 = P D_1 P^{-1},$$ can I determine the permutation matrix $P$, completely or partially?


There are 2 answers below.


I follow the notation on the Permutation matrix wiki page.

Let $p_1,\dots,p_n$ denote the diagonal entries of $D_1$, and let $q_1,\dots,q_n$ denote the diagonal entries of $D_2$. Suppose that $\pi$ is a permutation of $\{1,\dots,n\}$ for which $q_i = p_{\pi(i)}$ for $i = 1,\dots,n$, and let $P_\pi$ be the permutation matrix whose $i$th row is $e_{\pi(i)}^T$. Then we find that $$ D_2 = P_\pi D_1 P_{\pi}^{-1} = P_{\pi}D_1 P_{\pi}^T. $$
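A minimal sketch of recovering such a $\pi$ in practice (assuming NumPy and the row convention $P_{i,\pi(i)} = 1$; the function name `recover_permutation` is just for illustration): sort both diagonals and match positions of equal entries.

```python
import numpy as np

def recover_permutation(D1, D2):
    """Return a permutation array pi with q_i = p_{pi(i)}, so that the
    matrix P with P[i, pi[i]] = 1 satisfies P @ D1 @ P.T == D2.
    Assumes the diagonals are permutations of each other; when entries
    repeat, any one of the valid permutations is returned."""
    p = np.diag(D1)
    q = np.diag(D2)
    # Position k in sorted order pairs up an index of q with an index of p.
    pi = np.empty(len(p), dtype=int)
    pi[np.argsort(q, kind="stable")] = np.argsort(p, kind="stable")
    return pi

D1 = np.diag([3.0, 1.0, 2.0])
D2 = np.diag([1.0, 2.0, 3.0])
pi = recover_permutation(D1, D2)
P = np.eye(3)[pi]          # row i of P is the standard basis vector e_{pi(i)}
assert np.allclose(P @ D1 @ P.T, D2)
```

The stable sorts make the choice deterministic when diagonal entries repeat, which matters for the multiplicity discussion in the other answer.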


It depends on the multiplicities of the diagonal entries. In one extreme, if they are all equal, any permutation matrix works (in fact any invertible matrix, because a scalar matrix commutes with everything). At the opposite extreme, if they are all distinct, then you can determine the matrix uniquely, in the way already described.

The point is that if a value $p_1$ occurs with multiplicity $m_1$, then there are $m_1$ possible positions in $D_2$ to which the first entry of $D_1$ with value $p_1$ can be sent, $m_1 - 1$ for the second, and so on, so there are $m_1!$ ways to permute those positions.

In the end, if the multiplicities of the distinct values among the entries are $m_1, m_2, \dots, m_t$, then there are $m_1!\,m_2!\cdots m_t!$ different permutations that will work.
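This count is easy to confirm by brute force on a small hypothetical example (plain Python; the diagonals below are made up for illustration): a value of multiplicity 2 and a value of multiplicity 1 should give $2!\,1! = 2$ valid permutations.

```python
from itertools import permutations
from math import factorial, prod
from collections import Counter

p = [5, 5, 7]   # diagonal of D1: value 5 has multiplicity 2, value 7 has multiplicity 1
q = [7, 5, 5]   # diagonal of D2: a rearrangement of p

# Enumerate every permutation pi with q_i = p_{pi(i)}.
valid = [pi for pi in permutations(range(len(p)))
         if all(q[i] == p[pi[i]] for i in range(len(p)))]

# Predicted count: product of factorials of the multiplicities.
expected = prod(factorial(m) for m in Counter(p).values())
assert len(valid) == expected == 2
```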