I think this thread seems to say no
Can the matrix transpose be represented by $X^T = AXB$ for a given $A$ and $B$?
What I am asking is: given $X$, do there exist matrices $A_1, A_2,$ etc. and a sequence of multiplications such that
$\mathbf{X}^\intercal = ((\mathbf{X} * \mathbf{A}_1) * \mathbf{A}_2) \cdots$
or to get even more complex
$\mathbf{X}^\intercal = f(\mathbf{X}, \mathbf{A}_1, \mathbf{A}_2, \ldots)$, where the function $f$ is any combination of matrices and multiplications.
Why do I ask this? Because the linear combination of matrices reminds me of the models used in deep learning, and so I began to wonder whether a symbolic procedure (like matrix transposition) could be 'learned' by a deep learning model. Certainly I can try this: I have already generated a million random 3x3 matrices with their transposes to use as a training set. I am uncertain how to structure the model (how many layers, etc.), but there is no reason to suspect that such a model cannot be trained. Then I could give it any matrix and it would 'infer' the transpose. And this is where my brain fails me.
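As a sanity check on the training idea: transposition is linear in the *entries* of $X$, so a single linear layer on the flattened matrix can fit it exactly — no deep network needed. Here is a minimal sketch (my own illustration, not from the thread) that recovers the $9\times 9$ weight matrix by least squares from random 3x3 samples; the recovered weights form the permutation matrix that swaps entry $(i,j)$ with $(j,i)$:

```python
import numpy as np

# Sketch: fit vec(X^T) = vec(X) @ W from random samples.
# Because the map is linear and the samples span R^9, least
# squares recovers W exactly (a 9x9 permutation matrix).
rng = np.random.default_rng(0)
n = 3
Xs = rng.standard_normal((1000, n, n))            # random 3x3 matrices
inputs = Xs.reshape(1000, n * n)                  # rows: vec(X)
targets = Xs.transpose(0, 2, 1).reshape(1000, n * n)  # rows: vec(X^T)
W, *_ = np.linalg.lstsq(inputs, targets, rcond=None)

# Check on an unseen matrix: applying W transposes it.
test = rng.standard_normal((n, n))
print(np.allclose(test.reshape(-1) @ W, test.T.reshape(-1)))  # True
```

Note this does not contradict the question: the learned map acts on the *flattened* vector $\mathrm{vec}(X)$, which is not of the form $AXB$ on the matrix itself.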
So if a deep learning model can provide an affirmative answer to this question while logical analysis says it is impossible, I am confronted with an interesting contradiction: deep learning can solve a problem that logic cannot. But first I have to KNOW that matrix transposition CANNOT be symbolically represented as a combination of matrix multiplications.
I don't know any deep learning, but this is very basic linear algebra. Suppose $X$ is $n\times m$. Then $X^T=\sum_{i=1}^n\sum_{j=1}^m E_{ji}XE_{ji}$, where $E_{ji}$ denotes the $m\times n$ matrix whose only nonzero entry is a $1$ at the $(j,i)$-th position. Thus the transposition of an $n\times m$ matrix is a rank-$nm$ tensor; in particular, when $nm>1$ it cannot be written as a single product $AXB$.
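The identity above is easy to verify numerically; here is a short check (my own, using a $2\times 3$ example) that summing $E_{ji} X E_{ji}$ over all positions reproduces $X^T$:

```python
import numpy as np

# Verify X^T = sum_{i,j} E_{ji} X E_{ji}, where E_{ji} is the
# m x n matrix whose only nonzero entry is a 1 at position (j, i).
n, m = 2, 3
rng = np.random.default_rng(1)
X = rng.standard_normal((n, m))

total = np.zeros((m, n))
for i in range(n):
    for j in range(m):
        E = np.zeros((m, n))
        E[j, i] = 1.0
        total += E @ X @ E   # (m x n)(n x m)(m x n) -> m x n

print(np.allclose(total, X.T))  # True
```

Each term $E_{ji} X E_{ji}$ picks out the single entry $X_{ij}$ and places it at position $(j,i)$, so the $nm$ terms together assemble $X^T$ one entry at a time.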