Decomposition of Tensor into a Product of Tensors


I was working on a text that made use of the assumption that, for a rotation matrix $a_{ij}$ and a vector (rank-1 tensor) $U$, the components under rotation are given by $U'_{\alpha}=a_{\alpha i}U_i$. It used this (together with the outer product) to prove that: $$ T'_{\alpha\beta\gamma\delta}=a_{\alpha i}a_{\beta j}a_{\gamma k}a_{\delta l}T_{ijkl} $$

where $T_{ijkl}=U_{ij}V_{kl}$, i.e. $T = U \otimes V$.
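A quick numerical check of the claim (a minimal sketch using NumPy, with an arbitrary randomly generated rotation and factors): if $T = U \otimes V$, then rotating $T$ with four copies of $a$ gives the same result as rotating $U$ and $V$ separately and taking their outer product.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a proper 3x3 rotation matrix via QR decomposition.
q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
a = q * np.sign(np.linalg.det(q))  # flip sign if needed so det(a) = +1

U = rng.normal(size=(3, 3))
V = rng.normal(size=(3, 3))
T = np.einsum('ij,kl->ijkl', U, V)  # T_{ijkl} = U_{ij} V_{kl}

# T'_{abcd} = a_{ai} a_{bj} a_{ck} a_{dl} T_{ijkl}
T_rot = np.einsum('ai,bj,ck,dl,ijkl->abcd', a, a, a, a, T)

# Rotate the factors separately: U'_{ab} = a_{ai} a_{bj} U_{ij}, etc.
U_rot = np.einsum('ai,bj,ij->ab', a, a, U)
V_rot = np.einsum('ck,dl,kl->cd', a, a, V)

# The rotated product tensor equals the product of the rotated factors.
assert np.allclose(T_rot, np.einsum('ab,cd->abcd', U_rot, V_rot))
```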

Is there a simple proof that any tensor can be decomposed into a product of lower-rank tensors? (The proof given above seems to rely heavily on this being the case in order to work.)

Thank you.

Accepted answer:

No, you need the tensor product $\otimes$ to construct the most general tensors. The wedge product $\wedge$ produces skew-symmetric objects.


Well, the space of tensors on a vector space $V$ is constructed by taking tensor products of the form $V \otimes V \otimes \cdots \otimes V \otimes V^* \otimes V^* \otimes \cdots \otimes V^*$, where $V$ is the vector space and $V^*$ is its dual space. So an $(m,n)$-type tensor can always be written as $$ \mathbf{T} = T^{ijk\ldots}_{pqr\ldots}\, \hat{e}_{i} \otimes \hat{e}_{j} \otimes \cdots \otimes \hat{f}^{p} \otimes \hat{f}^{q} \otimes \cdots $$ in terms of the basis vectors $\hat{e}_i$ and covectors $\hat{f}^p$. So if you insist on a strict decomposition of a tensor into a single product, that is not always possible: in general you need a linear combination of such simple terms, as in the formula above. And that's just for decomposition into vectors and covectors.
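The basis expansion above can be made concrete (a minimal sketch for an arbitrary $(0,2)$ tensor in $d=3$, using the standard basis): the tensor is recovered as a linear combination of $d^2$ simple outer-product terms $T_{ij}\,\hat e_i \otimes \hat e_j$, not as any single product.

```python
import numpy as np

d = 3
rng = np.random.default_rng(1)
T = rng.normal(size=(d, d))  # components T_{ij} in the chosen basis

e = np.eye(d)                # rows are the standard basis vectors e_i

# Sum of d*d simple terms T_{ij} e_i (x) e_j reconstructs T exactly.
recon = sum(T[i, j] * np.outer(e[i], e[j])
            for i in range(d) for j in range(d))
assert np.allclose(recon, T)
```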

A tensor of the form you've used here ($T = U \otimes V$) is a very special case: a general (0,4) tensor cannot be written as the product of two (0,2) tensors.
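One way to see this numerically (a sketch under the assumption of a random generic tensor): flatten $T_{ijkl}$ into a $d^2 \times d^2$ matrix indexed by the pairs $(ij)$ and $(kl)$. If $T = U \otimes V$, that matrix is the rank-one matrix $\mathrm{vec}(U)\,\mathrm{vec}(V)^T$, whereas a generic (0,4) tensor flattens to a matrix of rank greater than one, so it cannot be a single product.

```python
import numpy as np

d = 3
rng = np.random.default_rng(2)

# A product tensor T = U (x) V flattens to a rank-one d^2 x d^2 matrix.
U = rng.normal(size=(d, d))
V = rng.normal(size=(d, d))
prod = np.einsum('ij,kl->ijkl', U, V)
assert np.linalg.matrix_rank(prod.reshape(d * d, d * d)) == 1

# A random (0,4) tensor almost surely flattens to rank > 1,
# hence cannot be written as a single product of two (0,2) tensors.
T = rng.normal(size=(d, d, d, d))
assert np.linalg.matrix_rank(T.reshape(d * d, d * d)) > 1
```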