Do there exist two matrices that can be used to transpose any n by n matrix?


Ideally, $\exists A, B$ that transpose $X$ by matrix multiplication, $\forall X \in M_{n\times n}$. (Even more ideal would be a single matrix $A$ that transposes $X$ as follows: $X^T = AX$, but I'm ignoring this possibility for now.)

So far, I'm guessing that such matrices don't exist and have been trying to prove that: $\not\exists A,B : \forall X\in M_{n\times n},\; X^T = AXB$.

I've examined the sub-case where $X$ is an orthogonal matrix, but that hasn't gotten me anywhere beyond $XAXB = I_n$.
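Not a proof, but a quick numerical sanity check (a NumPy sketch; the choice of the swap matrix as a candidate $A=B$ is just an illustrative guess): $n=1$ works trivially, and an obvious candidate already fails for $n=2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# n = 1 is trivial: A = B = [1] transposes every 1x1 matrix.
X1 = rng.standard_normal((1, 1))
assert np.allclose(np.array([[1.0]]) @ X1 @ np.array([[1.0]]), X1.T)

# For n = 2, a natural first guess -- the swap (permutation) matrix --
# already fails on a random X, hinting that no fixed A, B can work.
A = B = np.array([[0.0, 1.0], [1.0, 0.0]])
X = rng.standard_normal((2, 2))
print(np.allclose(A @ X @ B, X.T))  # False for a generic X
```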

There are 5 answers below.

---

It cannot be done:

When multiplying matrices we have

$$\mathbf N_{jk}=\sum_{u,v}\mathbf A_{ju}\mathbf M_{uv}\mathbf B_{vk}$$

In this case, we're looking for $\mathbf N=\mathbf M^T$, so $\mathbf N_{jk}=\mathbf M_{kj}$.

For $\mathbf N=\mathbf M^T$ we need the coefficient of $\mathbf M_{kj}$ in $\mathbf N_{jk}$ to be $1$: that is, $A_{jk}B_{jk} = 1$ for all $j,k$, so neither $A$ nor $B$ can have any zero entries. But every other coefficient must vanish: $A_{ju}B_{vk} = 0$ whenever $u \neq k$ or $v \neq j$. This cannot happen, because none of the entries of $A$ or $B$ are allowed to be $0$, as above.
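The coefficient bookkeeping can be spot-checked numerically (a NumPy sketch with random $A$, $B$, $M$): setting $M=E_{kj}$ isolates exactly the coefficient $A_{jk}B_{jk}$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A, B = rng.standard_normal((n, n)), rng.standard_normal((n, n))

# The triple product entrywise: N[j,k] = sum_{u,v} A[j,u] M[u,v] B[v,k].
M = rng.standard_normal((n, n))
N = np.einsum('ju,uv,vk->jk', A, M, B)
assert np.allclose(N, A @ M @ B)

# Taking M = E_{kj} isolates one coefficient: (A E_{kj} B)[j,k] = A[j,k] * B[j,k],
# which would have to equal 1 for N = M^T to hold for every M.
j, k = 0, 2
E_kj = np.zeros((n, n)); E_kj[k, j] = 1.0
assert np.isclose((A @ E_kj @ B)[j, k], A[j, k] * B[j, k])
```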

---

Suppose $X^T = AXB$. Taking $X=I$ gives $B=A^{-1}$. Taking $X=A$ then gives $A^T=A$.

Consider the case $n=2$ and the basis matrices $E_{ij}$, whose entries are all $0$ except the $(i,j)$ entry, which is $1$.

Taking $X=E_{12}$ gives $a_{11}=0$.

Taking $X=E_{21}$ gives $a_{22}=0$.

But then $AE_{11}A^{-1}=E_{22} \ne E_{11}^T$

So there is no solution for $n=2$, and hence none for any $n \ge 2$, because you can consider matrices $X$ supported on the principal $2 \times 2$ block. (For $n=1$, $A=B=1$ trivially works.)
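A concrete check of the contradiction (NumPy sketch): the constraints above leave a symmetric $A$ with zero diagonal; taking its off-diagonal value to be $1$ is a representative choice, since any nonzero value behaves the same.

```python
import numpy as np

# From X = I: B = A^{-1}; from X = A: A^T = A; from X = E12, E21: a11 = a22 = 0.
# Any surviving candidate looks like A = [[0, a], [a, 0]]; take a = 1.
A = np.array([[0.0, 1.0], [1.0, 0.0]])
B = np.linalg.inv(A)

E11 = np.array([[1.0, 0.0], [0.0, 0.0]])
E22 = np.array([[0.0, 0.0], [0.0, 1.0]])

result = A @ E11 @ B
assert np.allclose(result, E22)        # A E11 A^{-1} = E22 ...
assert not np.allclose(result, E11.T)  # ... which is not E11^T = E11
```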

---

If we stack the $n\times n$ matrices row by row into $n^2\times 1$ vectors, then the OP's question is:

Is the function $f:X\in M_n\rightarrow X^T\in M_n$ (we can represent $f$ as an $n^2\times n^2$ matrix) decomposable into a tensor product $A\otimes B^T$?

The answer is no, and that's no surprise! Yet $f$ can be written as a sum of $n^2$ such maps (perhaps we can do better).

Proof. Let $E_{i,j}$ be the $n\times n$ matrix whose entries are all $0$ except the $(i,j)$ one, which is $1$. The matrix $E_{i,j}\otimes E_{k,l}$ is an $n^2\times n^2$ block matrix whose blocks are all $0$ except the $(i,j)$ one, which is $E_{k,l}$; that is, this matrix has exactly one non-zero entry (and it is $1$). Conversely, every matrix with exactly one non-zero entry equal to $1$ is decomposable in this way.

Note that $f$ is a permutation matrix with $f^2 = I_{n^2}$, that is, a product of disjoint transpositions. Being a permutation matrix, $f$ has exactly $n^2$ non-zero entries, and consequently is the sum of $n^2$ such decomposable tensor products.
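Both claims can be checked numerically (a NumPy sketch). For the sum, one explicit choice of the $n^2$ decomposable terms is $X^T=\sum_{i,j}E_{ji}XE_{ji}$, since $E_{ji}XE_{ji}=x_{ij}E_{ji}$; for non-decomposability, the Van Loan rearrangement turns any single Kronecker product into a rank-$1$ matrix, while the matrix of $f$ rearranges to full rank $n^2$.

```python
import numpy as np

n = 3
rng = np.random.default_rng(3)
X = rng.standard_normal((n, n))

# One explicit sum of n^2 decomposable maps X -> A X B:
# X^T = sum_{i,j} E_{ji} X E_{ji}, because E_{ji} X E_{ji} = x_{ij} E_{ji}.
total = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        E_ji = np.zeros((n, n)); E_ji[j, i] = 1.0
        total += E_ji @ X @ E_ji
assert np.allclose(total, X.T)

# The n^2 x n^2 matrix K of f (row-stacked vectors) is NOT a single
# Kronecker product: Van Loan's rearrangement turns any A (x) C into the
# rank-1 matrix vec(A) vec(C)^T, but K rearranges to full rank n^2.
K = np.zeros((n*n, n*n))
for j in range(n):
    for k in range(n):
        K[j*n + k, k*n + j] = 1.0  # vec(X^T)[(j,k)] = vec(X)[(k,j)]

def rearrange(T, n):
    # Row (j, u) of the result is the flattened (j, u) block of T.
    R = np.zeros((n*n, n*n))
    for j in range(n):
        for u in range(n):
            R[j*n + u] = T[j*n:(j+1)*n, u*n:(u+1)*n].ravel()
    return R

A, C = rng.standard_normal((2, n, n))
assert np.linalg.matrix_rank(rearrange(np.kron(A, C), n)) == 1
print(np.linalg.matrix_rank(rearrange(K, n)))  # n^2 = 9, so K != A (x) C
```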

EDIT. Although my post does not interest anyone, I give below a result concerning decomposable permutations (the proof is not difficult).

$\textbf{Proposition}$. Among the $(n^2)!$ permutations of $ n ^ 2 $ elements, only those of the form $U\otimes V$, where $U,V$ are permutations of $n$ elements are decomposable (thus, only $(n!)^2$ permutations are decomposable).
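For $n=2$ the proposition's consequence can be verified by brute force (NumPy sketch): the transpose permutation is not among the $(2!)^2$ decomposable permutations $U\otimes V$.

```python
import numpy as np
from itertools import permutations

n = 2
# The transpose map's matrix K is a permutation of n^2 elements ...
K = np.zeros((n*n, n*n))
for j in range(n):
    for k in range(n):
        K[j*n + k, k*n + j] = 1.0

# ... but it is not U (x) V for any permutation matrices U, V.
perm_mats = [np.eye(n)[list(p)] for p in permutations(range(n))]
assert not any(np.array_equal(K, np.kron(U, V))
               for U in perm_mats for V in perm_mats)
```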

---

Presumably $n>1$. If $X^T=AXB$ for every $n\times n$ matrix $X$, then $vu^T=(Au)(v^TB)$ for every pair of nonzero vectors $u$ and $v$, meaning that for any fixed $u\ne0$, $Au$ is a nonzero multiple of every $v$. This is impossible because $n>1$.
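Numerically (a NumPy sketch with random $A$, $B$, $u$, $v$): every column of $A(uv^T)B$ is a multiple of $Au$, independent of $v$, whereas the columns of $(uv^T)^T = vu^T$ span the line through $v$, which moves with $v$.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
A, B = rng.standard_normal((n, n)), rng.standard_normal((n, n))
u, v = rng.standard_normal(n), rng.standard_normal(n)

# For rank-one X = u v^T, the product A X B = (A u)(B^T v)^T: every column
# lies in the span of A u, which does not depend on v at all.
left = A @ np.outer(u, v) @ B
assert np.allclose(left, np.outer(A @ u, B.T @ v))

# But (u v^T)^T = v u^T has column space span(v), which moves with v --
# so no fixed A can make A u parallel to every v when n > 1.
```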

---

What follows is much stronger than the required result: applied to the ring $R=M_n$ with $\phi(X)=AXB$ (taking $X=I$ forces $AB=I$, so $\phi(I)=I$, and transposition satisfies $\phi(XY)=\phi(Y)\phi(X)$), it shows that $M_n$ would have to be commutative, which is false for $n>1$.

EDIT. $\textbf{Proposition}$. Let $R$ be a ring with unity and $a,b\in R$. Let $\phi:x\in R\rightarrow axb\in R$ be s.t. $\phi(1)=1$ and, for every $x,y\in R$, $\phi(xy)=\phi(y)\phi(x)$.

Then $R$ is a commutative ring.

$\textbf{Proof}$. Note that $\phi(1)=ab=1$.

On one hand, $\phi(ab)=a(ab)b=(ab)(ab)=1$; on the other hand, $\phi(ab)=\phi(b)\phi(a)=(abb)(aab)=ab(ba)ab=ba$. Hence $ba=1$.

For every $x,y$: $axyb=\phi(xy)=\phi(y)\phi(x)=(ayb)(axb)=ay(ba)xb=ayxb$; multiplying on the left by $b$ and on the right by $a$, and using $ba=ab=1$, gives $xy=yx$. $\square$
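This is exactly why transposition on $R=M_n$ cannot have the form $\phi(X)=AXB$: it satisfies the two hypotheses, yet $M_n$ is not commutative for $n>1$. A quick NumPy reminder:

```python
import numpy as np

rng = np.random.default_rng(5)
X, Y = rng.standard_normal((2, 2, 2))

# Transposition satisfies phi(I) = I and phi(XY) = phi(Y) phi(X) ...
assert np.allclose((X @ Y).T, Y.T @ X.T)

# ... so if it had the form phi(X) = A X B, the proposition would force
# M_2 to be commutative, which it is not:
assert not np.allclose(X @ Y, Y @ X)
```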