Does the Cross Product Transform as a Rank-1 Tensor?


The problem itself is very simple: I have to prove that the cross product of two vectors, $C = A \times B$, satisfies the vector transformation law and is therefore a rank-1 tensor. In other words, I have to prove that

$C'_i = \lambda_{ij} C_j$

where

$C_j=\varepsilon_{jmn}A_mB_n$

and

$C_i'=\varepsilon_{ikl}A_k'B_l'=\varepsilon_{ikl}(\lambda_{km}A_m)(\lambda_{ln}B_n)$

I can clearly see that this statement is true when I draw a diagram of each vector $C, C', A, B, A', B'$, but I have failed to prove it algebraically. I tried substituting 1, 2, 3 for each index, which produced miserably long expressions. I also tried relabelling $i, k, l$ as $j, m, n$, as follows; I think this is the most promising attempt:

$\varepsilon_{ikl}\lambda_{km}\lambda_{ln}= \varepsilon_{jmn}\lambda_{mm}\lambda_{nn}+\varepsilon_{jnm}\lambda_{nm}\lambda_{mn}+ \varepsilon_{mjn}\lambda_{jm}\lambda_{nn}+\varepsilon_{mnj}\lambda_{nm}\lambda_{jn}+ \varepsilon_{nmj}\lambda_{mm}\lambda_{jn}+\varepsilon_{njm}\lambda_{jm}\lambda_{mn} $

which simplifies to

$\varepsilon_{ikl}\lambda_{km}\lambda_{ln}=\varepsilon_{jmn} (\lambda_{mm}\lambda_{nn}-\lambda_{nm}\lambda_{mn} -\lambda_{jm}\lambda_{nn}+\lambda_{nm}\lambda_{jn} -\lambda_{mm}\lambda_{jn}+\lambda_{jm}\lambda_{mn})$

I expected this to reduce to something like $\varepsilon_{jmn}(\lambda_{jj}+\lambda_{mj}+\lambda_{nj})$, so that

$C_i'=\varepsilon_{ikl}A_k'B_l'=\varepsilon_{jmn}(\lambda_{jj}+\lambda_{mj}+\lambda_{nj})A_mB_n= \varepsilon_{jmn}\lambda_{ij}A_mB_n$

but I always fail to convert $\lambda_{km}\lambda_{ln}$ into $\lambda_{ij}$.

Can somebody help me?
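For what it is worth, the identity being sought here, $\varepsilon_{ikl}\lambda_{km}\lambda_{ln}=\lambda_{ij}\varepsilon_{jmn}$ for a proper rotation $\lambda$, can at least be checked numerically. The following is only a sketch (using NumPy and an arbitrarily chosen rotation about the $z$-axis), not a proof:

```python
import numpy as np

# Levi-Civita symbol eps[i, j, k]
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

# A proper rotation (det = +1): rotation about the z-axis by angle t
t = 0.7
lam = np.array([[np.cos(t), -np.sin(t), 0.0],
                [np.sin(t),  np.cos(t), 0.0],
                [0.0,        0.0,       1.0]])

# Compare eps_{ikl} lam_{km} lam_{ln} with lam_{ij} eps_{jmn}
lhs = np.einsum('ikl,km,ln->imn', eps, lam, lam)
rhs = np.einsum('ij,jmn->imn', lam, eps)
print(np.allclose(lhs, rhs))  # True

# Equivalently: the cross product of the rotated vectors is the rotated cross product
a = np.array([1.0, 2.0, 3.0])
b = np.array([-1.0, 0.5, 2.0])
print(np.allclose(np.cross(lam @ a, lam @ b), lam @ np.cross(a, b)))  # True
```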

Best answer:

In a coordinate-free formulation, the cross product is defined by $$ \forall v\in\Bbb R^3:\quad\langle a\times b,v\rangle=\det(a,b,v). $$ Applying a coordinate transformation $\Lambda$ then has the effect that $$ \langle \Lambda a\times\Lambda b,\Lambda v\rangle=\det(\Lambda a,\Lambda b,\Lambda v)=\det(\Lambda)\,\det(a,b,v)=\det(\Lambda)\,\langle a\times b,v\rangle. $$ Since $\langle \Lambda a\times\Lambda b,\Lambda v\rangle=\langle \Lambda^T(\Lambda a\times\Lambda b),v\rangle$ holds for all $v$, it follows that $$ \det(\Lambda)\,(a\times b)=\Lambda^T(\Lambda a\times\Lambda b). $$ This only simplifies to the usual transformation law if $\Lambda$ is a special orthogonal transformation, i.e. $\det(\Lambda)=1$ and $\Lambda^T=\Lambda^{-1}$. So if you want to reconstruct that result in explicit coordinates, you have to use these relations somewhere.
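Note that the relation $\det(\Lambda)\,(a\times b)=\Lambda^T(\Lambda a\times\Lambda b)$ holds for any invertible $\Lambda$, not only rotations. A quick numerical check with a random matrix (only a sketch; the seed and vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
L = rng.standard_normal((3, 3))  # generic matrix, almost surely invertible
a = rng.standard_normal(3)
b = rng.standard_normal(3)

# det(L) * (a x b)  should equal  L^T (La x Lb)
lhs = np.linalg.det(L) * np.cross(a, b)
rhs = L.T @ np.cross(L @ a, L @ b)
print(np.allclose(lhs, rhs))  # True
```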

Note that $\det(\Lambda)\,\Lambda^{-T}$ is the cofactor matrix of $\Lambda$.
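That last remark can also be verified directly, by building the cofactor matrix entry by entry from signed minors and comparing it with $\det(\Lambda)\,\Lambda^{-T}$ (again only a numerical sketch with a random matrix):

```python
import numpy as np

rng = np.random.default_rng(7)
L = rng.standard_normal((3, 3))  # generic matrix, almost surely invertible

# Cofactor matrix: cof[i, j] = (-1)^(i+j) * det(minor with row i, column j deleted)
cof = np.empty((3, 3))
for i in range(3):
    for j in range(3):
        minor = np.delete(np.delete(L, i, axis=0), j, axis=1)
        cof[i, j] = (-1) ** (i + j) * np.linalg.det(minor)

print(np.allclose(cof, np.linalg.det(L) * np.linalg.inv(L).T))  # True
```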