Understanding the non-commutativity of the tensor product when tensors are interpreted as R-valued multilinear maps


In the textbook Geometric Control of Mechanical Systems by Francesco Bullo and Andrew D. Lewis, I find the following:

[Image: excerpt from the textbook giving the definition of the tensor product]

I am struggling to reconcile the definition of the tensor product given here with the statement that the tensor product is not commutative. It appears that the tensor product is defined by multiplying the image of $t_1$ with the image of $t_2$ (i.e. multiplying a real number by a real number) but isn't multiplication on the real numbers commutative?

I have a hypothesis and my goal with this question is to confirm whether it is correct. My hypothesis as to why these two facts agree is that $t_1 \otimes t_2$ and $t_2 \otimes t_1$ are different tensors (e.g. they have different components) but given the same inputs and somehow keeping track of which inputs went to $t_1$ and $t_2$, they both map to the same real number.

In other words, in the case where $t_1 \in T^1_0(V)$ and $t_2 \in T^0_1(V)$, both $t_1 \otimes t_2$ and $t_2 \otimes t_1$ lie in $T^1_1(V)$ and can be represented as matrices once a basis on $V$ is chosen. These matrices contain the same entries, but rearranged (in fact transposed) such that the inputs are, for lack of better phrasing, still being fed to their corresponding tensor. Therefore $t_1 \otimes t_2$ and $t_2 \otimes t_1$ are different tensors (their components are not the same), but the real number to which they map a particular pair of vector and covector inputs stays the same in some sense.
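To make the hypothesis concrete, here is a minimal numerical sketch (the component values are arbitrary, chosen only for illustration): the component arrays of $t_1 \otimes t_2$ and $t_2 \otimes t_1$ are outer products that are transposes of each other, yet feeding each input to "its" tensor's slot yields the same real number.

```python
import numpy as np

# Arbitrary basis components: t1 in T^1_0(V) is a vector, t2 in T^0_1(V) a covector.
t1 = np.array([1.0, 2.0, 3.0])   # components t1^i
t2 = np.array([4.0, 5.0, 6.0])   # components t2_j

# Component arrays of the two tensor products are outer products...
A = np.outer(t1, t2)   # (t1 ⊗ t2)^i_j = t1^i t2_j
B = np.outer(t2, t1)   # (t2 ⊗ t1)_i^j = t2_i t1^j

# ...and they are transposes of each other, hence different tensors in general.
assert not np.array_equal(A, B)
assert np.array_equal(A, B.T)

# Sample inputs: a covector alpha and a vector v (again arbitrary).
alpha = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, 1.0])

# Feeding each input to the slot of "its" tensor gives the same number:
# (t1 ⊗ t2)(alpha, v) = t1(alpha) t2(v) = t2(v) t1(alpha) = (t2 ⊗ t1)(v, alpha)
lhs = alpha @ A @ v   # contracts alpha with the t1 slot, v with the t2 slot
rhs = v @ B @ alpha   # contracts v with the t2 slot, alpha with the t1 slot
assert np.isclose(lhs, rhs)
print(lhs)            # prints 77.0
```

Note that the equality only holds because the input order was swapped along with the tensors; evaluated on the *same* ordered inputs, the two tensors generally disagree.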

My background is in mechanical engineering and I never had the privilege of a proper course in multilinear algebra.



I think you are overthinking this. Just apply the definition. If we want $$ t_1\otimes t_2 \overset?= t_2\otimes t_1 $$ then that means we need $$ (t_1\otimes t_2)(v,w) \overset?= (t_2\otimes t_1)(v,w) $$ for all vectors $v,w$, and applying the definition of the tensor product shows that we need $$ t_1(v)t_2(w) \overset?= t_2(v)t_1(w). $$ What reason is there for these two expressions to be equal?
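A tiny numerical check makes the failure explicit. Take $t_1, t_2$ to be covectors on $\mathbb{R}^2$ with components chosen arbitrarily for the counterexample:

```python
import numpy as np

# Two covectors on R^2 (components arbitrary, chosen to expose the asymmetry).
t1 = np.array([1.0, 0.0])   # t1(x) = x^1
t2 = np.array([0.0, 1.0])   # t2(x) = x^2

v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])

# (t1 ⊗ t2)(v, w) = t1(v) t2(w) = 1 * 1 = 1
# (t2 ⊗ t1)(v, w) = t2(v) t1(w) = 0 * 0 = 0
assert (t1 @ v) * (t2 @ w) == 1.0
assert (t2 @ v) * (t1 @ w) == 0.0
```

So $t_1(v)\,t_2(w)$ and $t_2(v)\,t_1(w)$ need not be equal, and the tensor product is not commutative.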

Now, if we choose a basis $e_i$ then multilinearity tells us that tensors are determined by their values on this basis. So define $$ T_{ij} = (t_1\otimes t_2)(e_i, e_j),\quad S_{ij} = (t_2\otimes t_1)(e_i, e_j). $$ It's easy enough to see that $T_{ij} = S_{ji}$, but this doesn't mean that they are the same tensor any more than a matrix is the same as its transpose.
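The relation $T_{ij} = S_{ji}$ can be verified directly by evaluating both tensor products on a standard basis (component values here are arbitrary):

```python
import numpy as np

# Two covectors on R^2 with arbitrary components.
t1 = np.array([1.0, 2.0])
t2 = np.array([3.0, 4.0])

e = np.eye(2)   # standard basis vectors e_0, e_1 as rows

# T_ij = (t1 ⊗ t2)(e_i, e_j) = t1(e_i) t2(e_j); similarly S_ij for t2 ⊗ t1.
T = np.array([[(t1 @ e[i]) * (t2 @ e[j]) for j in range(2)] for i in range(2)])
S = np.array([[(t2 @ e[i]) * (t1 @ e[j]) for j in range(2)] for i in range(2)])

assert np.array_equal(T, S.T)     # T_ij = S_ji: one is the transpose of the other
assert not np.array_equal(T, S)   # ...but they are different component matrices
```

As the assertions show, the two component matrices are transposes of each other, not equal, matching the point that a matrix and its transpose are different objects.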