I'm finding it difficult to wrap my head around how the transpose operation works for tensors of rank 3 and above. Here's an example in PyTorch.

I was transposing rank-3 tensors and expected them to follow something like the rule for rank-2 tensors, i.e. the ordinary 2D matrix transpose: $$ (A^T)_{ij} = A_{ji} $$

But when I transposed a rank-3 tensor I ended up with a different output, shown below. Can someone explain to me how this happens?
```
a = torch.tensor(
    [[[1., 1., 1.],
      [1., 1., 1.],
      [1., 1., 1.]],

     [[0., 0., 0.],
      [0., 0., 0.],
      [0., 0., 0.]],

     [[1., 1., 1.],
      [1., 1., 1.],
      [1., 1., 1.]]])
```
Printing the tensor and the shape gives the following:
```
print(a)
print(a.shape)
```

```
tensor([[[1., 1., 1.],
         [1., 1., 1.],
         [1., 1., 1.]],

        [[0., 0., 0.],
         [0., 0., 0.],
         [0., 0., 0.]],

        [[1., 1., 1.],
         [1., 1., 1.],
         [1., 1., 1.]]])
torch.Size([3, 3, 3])
```
Transposing with `a.T` we get:
```
tensor([[[1., 0., 1.],
         [1., 0., 1.],
         [1., 0., 1.]],

        [[1., 0., 1.],
         [1., 0., 1.],
         [1., 0., 1.]],

        [[1., 0., 1.],
         [1., 0., 1.],
         [1., 0., 1.]]])
```
Edit:
As suggested, I'm adding another example with unique values for every entry.

Tensor `b`:
```
tensor([[[ 1.,  2.,  3.],
         [ 4.,  5.,  6.],
         [ 7.,  8.,  9.]],

        [[10., 11., 12.],
         [13., 14., 15.],
         [16., 17., 18.]],

        [[19., 20., 21.],
         [22., 23., 24.],
         [25., 26., 27.]]])
```
Transpose of Tensor `b` (`b.T`):
```
tensor([[[ 1., 10., 19.],
         [ 4., 13., 22.],
         [ 7., 16., 25.]],

        [[ 2., 11., 20.],
         [ 5., 14., 23.],
         [ 8., 17., 26.]],

        [[ 3., 12., 21.],
         [ 6., 15., 24.],
         [ 9., 18., 27.]]])
```
It looks like this is what's happening.

Suppose the tensor components are $a_{ijk}$, where $i$ tells you which matrix you're talking about, $j$ which row, and $k$ which column. The transpose you presented then has components $a_{kji}$, so the transpose seems to reverse the indices, or at least to swap the first and last ones (for rank 3 these are the same thing).
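This index reversal can be checked directly in PyTorch (a quick sketch; `permute` makes the axis reordering explicit):

```python
import torch

# Rank-3 tensor with a unique value at every position, as in the edited example.
b = torch.arange(1., 28.).reshape(3, 3, 3)

# .T on a rank-3 tensor reverses all dimensions: (i, j, k) -> (k, j, i),
# i.e. it is the same as an explicit permute in reverse axis order.
assert torch.equal(b.T, b.permute(2, 1, 0))

# Element-wise: (b.T)[k, j, i] == b[i, j, k] for every index triple.
for i in range(3):
    for j in range(3):
        for k in range(3):
            assert b.T[k, j, i] == b[i, j, k]
```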
Of course, the $j$ and $k$ indices of your all-ones/all-zeros tensor can be swapped without changing $a_{ijk}$, which means $a_{kij}$ is another possibility consistent with your first output; you should have made the rows and columns in your choice of $a$ distinguishable so you can better see what the operation is doing.
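A quick sketch of why the all-constant tensor is ambiguous: two different axis permutations agree on it but disagree once every entry is unique (using `permute` for the explicit reorderings):

```python
import torch

# The tensor from the question: each 3x3 slice is constant,
# so a[i, j, k] does not depend on j or k at all.
a = torch.tensor([[[1., 1., 1.]] * 3,
                  [[0., 0., 0.]] * 3,
                  [[1., 1., 1.]] * 3])

# Full reversal (2, 1, 0) and the reordering (1, 2, 0) coincide on this
# degenerate tensor, so it cannot tell the two operations apart.
assert torch.equal(a.permute(2, 1, 0), a.permute(1, 2, 0))

# With unique entries the two permutations differ, which is why
# distinguishable rows/columns are needed to identify the operation.
b = torch.arange(1., 28.).reshape(3, 3, 3)
assert not torch.equal(b.permute(2, 1, 0), b.permute(1, 2, 0))
```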