Why is the following product of tensor components represented by matrix multiplication involving a transpose?


In tensor notation, the transformation of the electromagnetic field tensor under a change of inertial reference frame is given by the following formula:

$$F'^{\alpha\beta} = \varLambda^{\alpha}_{\mu}\varLambda^{\beta}_{\nu}F^{\mu\nu}$$

But when this is represented by matrix multiplication it becomes:

$$F'=\varLambda F \varLambda^T$$ where $F'$ is the matrix representation of the tensor $F'^{\alpha\beta}$, $F$ that of $F^{\mu\nu}$, and $\varLambda$ that of either of the two factors $\varLambda^{\alpha}_{\mu}$ or $\varLambda^{\beta}_{\nu}$, which have the same components.

I guess that in the end I am asking how matrix multiplication is defined in tensorial form, or rather, when and how I can take an expression written in index notation and represent it as a matrix multiplication.


Accepted answer:
In linear algebra we are told that the multiplication of matrices $C=AB$ can be represented entrywise by $$ C_{ij} = A_{ik}B_{kj} $$ where the first index labels the row and the second index the column of an entry. In your case it is actually the same. The RHS of the equation

$$F'^{\alpha\beta} = \varLambda^{\alpha}_{\mu}\varLambda^{\beta}_{\nu}F^{\mu\nu}$$

can be interpreted as a multiplication of matrices too. You must know that $$\Lambda^{\mu}_{\nu} \quad \text{and} \quad F^{\mu\nu}$$ represent the entries of the matrices $\Lambda$ and $F$, regarding the upper index $\mu$ as the row and the lower index $\nu$ as the column. So if you rearrange as $$ F'^{\alpha\beta} = (\varLambda^{\alpha}_{\mu}F^{\mu\nu}) \varLambda^{\beta}_{\nu} $$ you see that the first two factors represent $\Lambda F$. So if we write $M^{\alpha \nu} = \varLambda^{\alpha}_{\mu}F^{\mu\nu}$, then we have $$ F'^{\alpha\beta} = M^{\alpha \nu} \varLambda^{\beta}_{\nu} $$ Now, to make sense of the multiplication above, note that $M^{\alpha \nu}= (M^T)^{\nu \alpha}$ (and similarly for any matrix), so $$ F'^{\alpha\beta} = \varLambda^{\beta}_{\nu} (M^T)^{\nu \alpha}= (\Lambda M^T)^{\beta \alpha} = (\Lambda F^T \Lambda^T)^{\beta \alpha} = ((\Lambda F^T \Lambda^T)^T)^{\alpha \beta} = (\varLambda F \varLambda^T)^{\alpha \beta} $$
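A quick numerical sanity check of this identity, as a NumPy sketch; the boost matrix and the antisymmetric field values below are placeholders chosen for illustration, not data from the question:

```python
import numpy as np

# Placeholder Lorentz boost along x with beta = 0.6 (gamma = 1.25)
g, gb = 1.25, 0.75  # gamma, gamma*beta
L = np.array([[ g, -gb, 0, 0],
              [-gb,  g, 0, 0],
              [ 0,   0, 1, 0],
              [ 0,   0, 0, 1]])

# Placeholder antisymmetric field-strength matrix F^{mu nu}
F = np.array([[ 0,  1,  2,  3],
              [-1,  0,  4,  5],
              [-2, -4,  0,  6],
              [-3, -5, -6,  0]], dtype=float)

# Index formula: F'^{ab} = Lambda^a_m Lambda^b_n F^{mn}
F_index = np.einsum('am,bn,mn->ab', L, L, F)

# Matrix formula: F' = Lambda F Lambda^T
F_matrix = L @ F @ L.T

assert np.allclose(F_index, F_matrix)
```

The `einsum` subscript string mirrors the index expression directly: each repeated letter is a summed index, so the equality of the two results is exactly the claim being proved above.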

$\textbf{Edit :}$

We can also look at this in the following way. To make sense of the equation $$ F'^{\alpha\beta} = M^{\alpha \nu} \varLambda^{\beta}_{\nu}, $$ we can regard $\varLambda^{\beta}_{\nu} = (\Lambda^T)_{\nu}^{\phantom{x}\beta}$. So now the index $\nu$ represents the row and $\beta$ the column. So, $$ F'^{\alpha\beta} = M^{\alpha \nu} (\varLambda^T)_{\nu}^{\phantom{x}\beta} = \varLambda^{\alpha}_{\mu}F^{\mu\nu} (\varLambda^T)_{\nu}^{\phantom{x}\beta} $$ This is just $F'=\Lambda F \Lambda^T$. I chose not to use this interpretation at first because I think it can be confusing (regarding the upper and lower indices of $\Lambda$ and its transpose $\Lambda^T$), but it is simpler and more direct, as demonstrated in many physics texts.
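The key step here is that contracting over the second index of both factors is the same as multiplying by the transpose. A minimal sketch of that fact, using arbitrary random matrices as stand-ins for $\Lambda$ and $F$:

```python
import numpy as np

rng = np.random.default_rng(0)
L = rng.normal(size=(4, 4))   # stand-in for Lambda
F = rng.normal(size=(4, 4))   # stand-in for F

M = L @ F  # M^{alpha nu} = Lambda^alpha_mu F^{mu nu}

# Contracting over the SECOND index of both M and L ...
lhs = np.einsum('an,bn->ab', M, L)

# ... equals ordinary multiplication by the transpose:
rhs = M @ L.T

assert np.allclose(lhs, rhs)
```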

Second answer:

The issue has to do with which indexing convention you are using.

If we do not care about index placement, multiplication of matrices always looks like $$(AB)_{ij}=A_{is}B_{sj},$$ with $s$ summed over, or with other placements such as $$(AB)^i{}_k=A^{is}B_{sk}.$$

The core point is that the second index of the first matrix coincides with the first index of the second matrix, always keeping in mind which index is first (rows) and which is second (columns).

So, for your problem, with a better index technique, $$F'^{\alpha\beta}= \Lambda^{\alpha}{}_{\mu}\Lambda^{\beta}{}_{\nu}F^{\mu\nu}$$ $$= \Lambda^{\alpha}{}_{\mu}F^{\mu\nu}\Lambda^{\beta}{}_{\nu}$$ $$= (\Lambda^{\alpha}{}_{\mu}F^{\mu\nu}) (\Lambda^{\top})_{\nu}{}^{\beta},$$ i.e. $F'=\Lambda F \Lambda^{\top}$.
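The "second index of the first factor matches the first index of the second" rule can be spelled out in one short sketch (random matrices as placeholders); when the summed index is instead the second index of both factors, the transpose appears:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3))

# (AB)_{ij} = A_{is} B_{sj}: summed index s is second in A, first in B
assert np.allclose(np.einsum('is,sj->ij', A, B), A @ B)

# (A B^T)_{ij} = A_{is} B_{js}: summed index is second in BOTH factors
assert np.allclose(np.einsum('is,js->ij', A, B), A @ B.T)
```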