A vector-product formula


Let $\mathbf a, \mathbf b$ be vectors in $\mathbb R^3$ and let $R$ be a $3\times 3$ matrix. Then we have $$ ^t\!R\bigl(R\mathbf a\times R\mathbf b\bigr)=(\det R)(\mathbf a\times \mathbf b). \tag 1$$ The proof of (1) uses the very definition of the vector product: we have $ \langle \mathbf a\times \mathbf b, \mathbf c\rangle=\det( \mathbf a, \mathbf b, \mathbf c) $ so that $$\langle ^t\!R\ (R\mathbf a\times R\mathbf b), \mathbf c\rangle =\langle R\mathbf a\times R\mathbf b, R\mathbf c\rangle=\det( R\mathbf a, R\mathbf b, R\mathbf c)=(\det R)\det( \mathbf a, \mathbf b, \mathbf c) =(\det R)\langle \mathbf a\times \mathbf b, \mathbf c\rangle, $$ which yields (1). My question: can anyone survive a "direct" proof of (1) by brute force, i.e. by calculating explicitly all the terms on each side? I suspect that a skillful use of the Einstein convention could help, but it seems to me that this is one more example where a little abstraction saves you from an intractable computation.
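Before attempting a symbolic brute-force proof, a quick numerical sanity check of (1) is easy. The following sketch (using NumPy; the variable names `R`, `a`, `b` simply mirror the symbols above) verifies the identity on random data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random 3x3 matrix R and random vectors a, b in R^3
R = rng.standard_normal((3, 3))
a = rng.standard_normal(3)
b = rng.standard_normal(3)

# Left-hand side: R^T (Ra x Rb)
lhs = R.T @ np.cross(R @ a, R @ b)

# Right-hand side: det(R) (a x b)
rhs = np.linalg.det(R) * np.cross(a, b)

assert np.allclose(lhs, rhs)
```

Note that $R$ need not be invertible or orthogonal here; (1) holds for an arbitrary $3\times 3$ matrix, and the check passes for any random draw.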


A basic index-based computation is entirely feasible. To make it clean, you need the Einstein summation convention and the Levi-Civita symbol $\epsilon_{ijk}$. See Wikipedia for a collection of useful formulas expressing cross products and determinants in terms of $\epsilon$. First expand everything:

\begin{align} (R^T(Ra \times Rb))_i &= R_{ji} (Ra \times Rb)_j \\ &= R_{ji} (Ra)_k (Rb)_l \epsilon_{jkl} \\ &= R_{ji} R_{km} a_m R_{ln}b_n\ \epsilon_{jkl} \end{align} and then reorder everything and group again: \begin{align} &= \left(R_{ji} R_{km} R_{ln}\epsilon_{jkl}\right) a_m b_n \\ &= \det(R)\epsilon_{imn} a_m b_n \\ &= \det(R)(a\times b)_i \end{align} The key step is the standard identity $R_{ji} R_{km} R_{ln}\,\epsilon_{jkl} = \det(R)\,\epsilon_{imn}$, which is just the Leibniz formula for the determinant written with the Levi-Civita symbol.
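The contraction $R_{ji} R_{km} R_{ln}\epsilon_{jkl} = \det(R)\epsilon_{imn}$ that powers the last step can itself be checked numerically. A minimal sketch with NumPy's `einsum` (building $\epsilon$ explicitly as a $3\times3\times3$ array, a hypothetical helper not present in NumPy itself):

```python
import numpy as np

# Build the Levi-Civita symbol eps[j, k, l]: +1 on even permutations,
# -1 on odd permutations, 0 otherwise.
eps = np.zeros((3, 3, 3))
for j, k, l in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[j, k, l] = 1.0   # even permutation
    eps[j, l, k] = -1.0  # odd permutation (swap last two indices)

rng = np.random.default_rng(1)
R = rng.standard_normal((3, 3))

# Contract R_{ji} R_{km} R_{ln} eps_{jkl}, leaving free indices i, m, n
lhs = np.einsum('ji,km,ln,jkl->imn', R, R, R, eps)

# Compare against det(R) eps_{imn}
rhs = np.linalg.det(R) * eps

assert np.allclose(lhs, rhs)
```

With this identity granted, the "brute force" proof above is really just bookkeeping, which is perhaps the fairest answer to the question: the index calculus absorbs the combinatorics that would otherwise make the componentwise expansion intractable.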