What exactly are operations involving tensors... In terms of their indices


So I have heard that operations on tensors involve the faces of the rectangular prism of entries. Those faces are matrices, right? And do properties of those matrices tell you things about the tensor? Could someone explain this better?
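For concreteness, the matrix "faces" of an order-3 tensor are what you get by fixing one index of a 3-dimensional array; a small sketch (numpy is just an illustration choice here, not something from the question):

```python
import numpy as np

# An order-3 tensor stored as a 2 x 3 x 4 rectangular prism of numbers.
T = np.arange(24).reshape(2, 3, 4)

# Fixing one index at a time slices out the matrix "faces":
front = T[0, :, :]   # fix the first index:  a 3x4 matrix
side  = T[:, 0, :]   # fix the second index: a 2x4 matrix
top   = T[:, :, 0]   # fix the third index:  a 2x3 matrix

print(front.shape, side.shape, top.shape)  # (3, 4) (2, 4) (2, 3)
```

Each axis gives a different family of faces, which is one reason matrix intuitions only partially transfer.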

Also, I have heard of the "tensor product". What is it, exactly, in terms of the elements within the matrices? I usually see strange equations. Does it break down into matrix multiplication, the way matrix multiplication breaks down into repeated vector products? Do one-dimensional tensor products (matrix multiplication) stay the same, or are they different in some way?

Do tensors have "row" operations like matrix row reduction? What about inverses, determinants, eigenvalues, eigenvectors, etc.?

Overall what do tensors have in common with matrices and how do their operations vary?

Best answer

I'll try to give an answer to this question that doesn't require knowing what a tensor is. Of course it will require some comfort with linear algebra, but I'll even try to keep the multilinear algebra to a minimum.

Bye_World's answer in the comments is a good one, although there is slightly more that you can say: a matrix is naturally identified with a rank-two tensor with one covariant and one contravariant argument. (It is somewhat unfortunate that vectors and covectors are so similar, since it's not terribly easy to explain why the 'X-variant' qualifications are necessary.)

Matrix multiplication is intended to model composition of linear endomorphisms (that is, linear maps $V\to V$). Since (real) tensors are roughly supposed to be multilinear maps $V^n\to \Bbb R$, it's not terribly surprising that there's no natural notion of "tensor multiplication": it's just not possible to compose tensors in the same way that it is possible to compose matrices, because the domain and the codomain are so radically different.
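The domain/codomain mismatch can be made concrete with a small numpy sketch (my own illustration, not part of the answer): two matrices compose into another matrix, while an order-3 tensor viewed as a multilinear map $V^3\to\Bbb R$ just consumes three vectors and produces a scalar, leaving nothing of the right shape to compose with.

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[0., 1.], [1., 0.]])
C = A @ B                      # matrices compose: the result is another 2x2 matrix

# An order-3 tensor as a trilinear map V^3 -> R, with V = R^2:
T = np.random.rand(2, 2, 2)
x, y, z = np.random.rand(2), np.random.rand(2), np.random.rand(2)
value = np.einsum('ijk,i,j,k->', T, x, y, z)   # a single scalar
# There is no analogue of A @ B for T: its "output" is a number,
# not something another tensor of the same kind can consume.
```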

[ This is slightly disingenuous, since matrices-as-tensors are $V^2\to\Bbb R$. Handwaving a lot, the idea is that the contravariance of one component allows you to take one copy of $V$ and "slide it across the arrow". But since the arrow only has two sides, you can't do this for $V^3\to\Bbb R$, for instance, to get the same domain and codomain. ]
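The "slide it across the arrow" trick is easy to see numerically (a sketch, implicitly identifying $V = \Bbb R^2$ with its dual via the standard basis): the same array $A$ can be read as a bilinear map $(x, y)\mapsto x^\top A y$ or, after moving one copy of $V$ to the other side, as a linear map $y\mapsto Ay$, and the two readings agree.

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
x = np.array([1., -1.])
y = np.array([2., 5.])

bilinear = x @ A @ y        # A read as a map V x V -> R
curried  = x @ (A @ y)      # A read as a map V -> V, then paired with x
assert np.isclose(bilinear, curried)
```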

Basically all of the other notions of matrix operations "morally should not" carry over, for the same reason: row operations preserve the kernel (or nullspace) of the matrix, a concept that is trivial for tensors because $\Bbb R$ is one-dimensional. Determinants talk about invertibility, which again makes no sense. Eigenstuff comes from the equation $Ax=\lambda x$, which doesn't make sense either. (Although, if we assume that $V$ is an inner product space, then we can rewrite the equation as $A(x,x^*)=\lambda||x||$, and maybe you can do something with that?)

There is a notion of a tensor product, but this product is not (a priori) a product of tensors, but a product of tensor spaces, in much the same way that the direct sum is not (a priori) a sum of vectors but a sum of vector spaces.

[ It's worth noting that the tensor product is a genuinely more complicated operation in the following sense: you can back-define a notion of 'direct sum of vectors' as an operation $V\times W\to V\oplus W$ which sends $(v,w)\mapsto (v,0)+(0,w)$; similarly you can back-define a notion of 'tensor product of tensors' as an operation $V\times W\to V\otimes W$ which sends $(v,w)\mapsto v\otimes w$. But the tensor product $V\otimes W$ has many tensors which are not of this form, unlike the direct sum. More succinctly, the direct sum of vectors is surjective onto $V\oplus W$, but the tensor product of tensors is not surjective onto $V\otimes W$. ]
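Identifying $\Bbb R^2\otimes\Bbb R^2$ with $2\times 2$ matrices (a standard identification; the numpy sketch below is my own illustration), the simple tensor $v\otimes w$ is the outer product of $v$ and $w$, which always has rank at most one, while the identity matrix has rank two, so it is not of the form $v\otimes w$ for any $v, w$:

```python
import numpy as np

v = np.array([1., 2.])
w = np.array([3., 4.])

simple = np.outer(v, w)                      # the simple tensor v (x) w
assert np.linalg.matrix_rank(simple) == 1    # every v (x) w has rank <= 1

I = np.eye(2)                                # an element of R^2 (x) R^2 ...
assert np.linalg.matrix_rank(I) == 2         # ... that is not any v (x) w
# I is, however, a SUM of simple tensors: e1 (x) e1 + e2 (x) e2.
```

This is the non-surjectivity in question: general tensors are sums of simple ones, not single products.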