Transposing a matrix using matrix multiplication


I came across an interesting problem in a linear algebra problem book. It appears in the very first section of the book, which covers basic operations on matrices: multiplication by a scalar, addition, matrix multiplication, and transposition. I solved all the previous problems without much trouble, but I can't make any headway with this one.

Problem. Let $X$ be a matrix of general form. Is it always possible to find matrices $A$ and $B$ such that $AXB=X^T$?

I used two approaches:

  1. Using the general formula for matrix multiplication. This is a complete nightmare: double sums everywhere, with no way through. A dead end.

  2. Trying specific matrices as $X$. Matrix units seem a good option, since they are very easy to multiply. But for any single matrix unit there always exist $A$ and $B$ such that $AXB=X^T$, so no counterexample arises. A dead end again.
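The observation in approach 2 can be made concrete: for a matrix unit $X=E_{ij}$, the single pair $A=B=E_{ji}$ already works, since $E_{ji}E_{ij}E_{ji}=E_{jj}E_{ji}=E_{ji}=X^T$. A quick NumPy check (my own illustration, not from the book):

```python
import numpy as np

n = 3

def unit(i, j, n=n):
    """Matrix unit E_ij: 1 in position (i, j), zeros elsewhere."""
    E = np.zeros((n, n))
    E[i, j] = 1.0
    return E

# For every matrix unit X = E_ij, the pair A = B = E_ji transposes X.
for i in range(n):
    for j in range(n):
        X = unit(i, j)
        A = B = unit(j, i)
        assert np.array_equal(A @ X @ B, X.T)
```

So matrix units can never disprove existence; the pair $(A,B)$ just depends on $X$.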

I've seen the questions Is it possible to transpose a square matrix by multiplication? and Can you transpose a matrix using matrix multiplication? but the answers there are not really relevant to my question, I think.

3 Answers

Accepted answer

The question: given a real $X$, are there $A,B$ such that $AXB= X^T$?

If $X$ is square then https://math.stackexchange.com/a/94617/27978 shows that $X$ and $X^T$ are similar. This is the tricky part; the rest is straightforward.

If $AXB = X^T$, then transposing gives $B^T X^T A^T = X$, so the problem is symmetric in $X$ and $X^T$; hence, with the square case settled, we may assume $n>m$ (more rows than columns).

Using elementary row operations, we may write $PX = \begin{bmatrix} U \\ 0 \end{bmatrix}$ for some invertible $P$, where $U$ is square and upper triangular. Since every square matrix is similar to its transpose, there is some invertible $V$ such that $V U V^{-1} = U^T$.

Then $\begin{bmatrix} V & 0 \end{bmatrix} PX \begin{bmatrix} V^{-1} & 0\end{bmatrix} = \begin{bmatrix} V & 0 \end{bmatrix} \begin{bmatrix} U \\ 0 \end{bmatrix} \begin{bmatrix} V^{-1} & 0\end{bmatrix} = \begin{bmatrix} U^T & 0 \end{bmatrix} = X^T P^T$.

Choosing $A=\begin{bmatrix} V & 0 \end{bmatrix} P$ and $B=\begin{bmatrix} V^{-1} & 0\end{bmatrix} P^{-T}$ gives the desired result.
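A numerical sketch of this construction (my own illustration, not part of the answer), using NumPy and SciPy: the row reduction $PX=\begin{bmatrix}U\\0\end{bmatrix}$ comes from a full QR factorization, so $P=Q^T$ is orthogonal and $P^{-T}=P$, and $V$ is found as an invertible element of the solution space of $VU-U^TV=0$:

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)

# A tall example (n > m); the wide case follows by transposing the equation.
X = np.array([[1., 2.],
              [3., 4.],
              [5., 7.]])
n, m = X.shape

# Step 1: P X = [[U], [0]] with U square upper triangular.
# A full QR factorization X = Q R gives this with P = Q^T (orthogonal).
Q, R = np.linalg.qr(X, mode='complete')
P = Q.T
U = R[:m, :]

# Step 2: an invertible V with V U V^{-1} = U^T, i.e. V U - U^T V = 0.
# Column-major vectorization turns this into (kron(U^T, I) - kron(I, U^T)) vec(V) = 0.
K = np.kron(U.T, np.eye(m)) - np.kron(np.eye(m), U.T)
basis = null_space(K)
while True:  # a generic element of the solution space is invertible
    V = (basis @ rng.normal(size=basis.shape[1])).reshape(m, m, order='F')
    if abs(np.linalg.det(V)) > 1e-8:
        break

# Step 3: A = [V 0] P and B = [V^{-1} 0] P^{-T}; here P^{-T} = P (orthogonal).
Z = np.zeros((m, n - m))
A = np.hstack([V, Z]) @ P
B = np.hstack([np.linalg.inv(V), Z]) @ P

assert np.allclose(A @ X @ B, X.T)
```

The QR route is just one convenient way to reach the triangular form; any invertible $P$ from row operations would do, at the cost of computing $P^{-T}$ explicitly.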


Original answer (I thought the question was to determine whether fixed $A,B$ exist so that $AXB = X^T$ for all $X$.)

It is trivially true for $1 \times 1$ matrices, since then $X^T = X$ and $A=B=1$ works :-). So suppose we are dealing with $n \times m$ matrices where $m$ or $n$ is greater than one.

Choose $X= e_i e_j^T$. Computing the $(a,b)$ entry of $Ae_i e_j^TB=e_j e_i^T$ gives $A_{ai} B_{jb} = \delta_{aj} \delta_{bi}$.

Choosing $a=j$, $b=i$ gives $A_{ji} B_{ji} = 1$, and since $i,j$ were arbitrary, every entry of $A$ and $B$ is nonzero. Now take $a \neq j$ or $b \neq i$ (at least one such choice exists because $m>1$ or $n>1$); then $A_{ai} B_{jb} = 0$, contradicting the fact that all entries are nonzero.
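The contradiction can also be checked mechanically. Here is a SymPy sketch (my own, not part of the answer) that writes down the constraints $AE_{ij}B=E_{ji}$ for all $2\times 2$ matrix units and confirms the resulting bilinear system has no solution:

```python
import sympy as sp
from itertools import product

# 2 x 2 case; the same contradiction appears whenever n or m exceeds 1.
n = 2
A = sp.Matrix(n, n, sp.symbols('a:4'))
B = sp.Matrix(n, n, sp.symbols('b:4'))

# Require A * E_ij * B = E_ji for every matrix unit E_ij.
eqs = []
for i, j in product(range(n), repeat=2):
    Eij = sp.zeros(n, n)
    Eij[i, j] = 1
    eqs += list(A * Eij * B - Eij.T)

# The bilinear system is inconsistent: no fixed pair (A, B) works for all X.
sols = sp.solve(eqs, list(A) + list(B))
assert sols == []
```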

Answer 2

If $X$ is a matrix over a field, then using elementary row and column operations, $X$ can be diagonalized. That is, there exist invertible square matrices $L$ and $R$ such that $$L X R = D.$$ Now proceed by showing that a diagonal matrix can be transposed as desired. Finally, reverse the row and column operations (switching the roles of rows and columns) to obtain the transpose of $X$ itself.

Answer 3

$\def\eqdef{\stackrel{\text{def}}{=}}$ Here are the details of the procedure described by WimC in his answer. When he refers to $\ X\ $ being "diagonalized", I believe he means bringing $\ X\ $ to the form on the right-hand side of the first equation below, which need not be square.

Let $\ X\ $ be an $\ n\times m\ $ matrix of rank $\ r\ (\le\min(n,m))\ $. By performing reduced row reduction on $\ X\ $ and then elementary column operations on its row-reduced form, we can find an invertible $\ n\times n\ $ matrix $\ E\ $ and an invertible $\ m\times m\ $ matrix $\ P\ $ such that $$ EXP=\pmatrix{I_{r\times r}&0_{r\times(m-r)}\\ 0_{(n-r)\times r}&0_{(n-r)\times(m-r)}}\ . $$ (A column permutation alone is not enough in general: the non-pivot columns of the row-reduced form may contain nonzero entries, which must be cleared by further column operations, so $\ P\ $ need not satisfy $\ P^T=P^{-1}\ $.) Transposing and rearranging, $$ X^T=\big(P^T\big)^{-1}\pmatrix{I_{r\times r}&0_{r\times(n-r)}\\ 0_{(m-r)\times r}&0_{(m-r)\times(n-r)}}\big(E^T\big)^{-1}\ .\hspace{1em}\tag{1}\label{eq1} $$

But, putting $\ \nu\eqdef n-r\ $ and $\ \mu\eqdef m-r\ $ for brevity, \begin{align} &\pmatrix{I_{r\times r}&0_{r\times\nu}\\ 0_{\mu\times r}&0_{\mu\times\nu}}\pmatrix{I_{r\times r}&0_{r\times\mu}\\ 0_{\nu\times r}&0_{\nu\times\mu}}\pmatrix{I_{r\times r}&0_{r\times \nu}\\0_{\mu\times r}&0_{\mu\times\nu}}\\ =&\pmatrix{I_{r\times r}&0_{r\times\mu}\\ 0_{\mu\times r}&0_{\mu\times \mu}}\pmatrix{I_{r\times r}&0_{r\times \nu}\\0_{\mu\times r}&0_{\mu\times\nu}}\\ =&\pmatrix{I_{r\times r}&0_{r\times \nu}\\0_{\mu\times r}&0_{\mu\times\nu}}\ , \end{align} which is the matrix in the centre of the right-hand side of equation \eqref{eq1}.

Thus, now putting \begin{align} N&\eqdef\pmatrix{I_{r\times r}&0_{r\times\nu}\\ 0_{\mu\times r}&0_{\mu\times\nu}}\\ M&\eqdef\pmatrix{I_{r\times r}&0_{r\times \nu}\\0_{\mu\times r}&0_{\mu\times\nu}} \end{align} and substituting the expression $\ N\pmatrix{I_{r\times r}&0_{r\times\mu}\\ 0_{\nu\times r}&0_{\nu\times\mu}}M\ $ for the matrix $\ \pmatrix{I_{r\times r}&0_{r\times \nu}\\0_{\mu\times r}&0_{\mu\times\nu}}\ $ in equation \eqref{eq1}, we finally have \begin{align} X^T&=\big(P^T\big)^{-1}N\pmatrix{I_{r\times r}&0_{r\times\mu}\\ 0_{\nu\times r}&0_{\nu\times\mu}}M\big(E^T\big)^{-1}\\ &=\big(P^T\big)^{-1}N(EXP)M\big(E^T\big)^{-1}\\ &=AXB\ , \end{align} where $\ A\eqdef\big(P^T\big)^{-1}NE\ $ and $\ B\eqdef PM\big(E^T\big)^{-1}\ $.
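For concreteness, here is a SymPy sketch of the whole procedure (my own illustration; the helper name `row_reduce_with_E` is made up). Row-reducing the augmented matrix $[\,M \mid I\,]$ yields both $\operatorname{rref}(M)$ and an invertible matrix performing the reduction; applying this twice, to $X$ and then to $(EX)^T$, produces the rank normal form $EXP$. Since the resulting $P$ is invertible but not necessarily a permutation, the left factor is taken as $(P^T)^{-1}NE$:

```python
import sympy as sp

def row_reduce_with_E(M):
    """Return (E, R) with E invertible and E*M = R = rref(M)."""
    aug = M.row_join(sp.eye(M.rows)).rref(pivots=False)
    return aug[:, M.cols:], aug[:, :M.cols]

# A rank-deficient, non-square example (n = 2, m = 3, r = 1).
X = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])
n, m = X.shape
r = X.rank()

E, R = row_reduce_with_E(X)        # E*X = rref(X)
E2, D0 = row_reduce_with_E(R.T)    # E2*(E*X)^T = [[I_r, 0], [0, 0]]  (m x n)
P = E2.T                           # invertible m x m column-operation matrix
D = E * X * P                      # rank normal form [[I_r, 0], [0, 0]]  (n x m)

# The m x n block matrices N and M from the answer (equal as written).
N = sp.zeros(m, n)
N[:r, :r] = sp.eye(r)
M = N

A = (P.T).inv() * N * E            # (P^T)^{-1} N E, since P need not be orthogonal
B = P * M * (E.T).inv()
assert A * X * B == X.T
```

Because everything is done over the rationals, the final identity holds exactly, not just to floating-point tolerance.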