Can it be shown geometrically that the cross product's magnitude is the area of a parallelogram, and that the cross product is perpendicular to the crossed vectors?


I found similar questions, but they did not derive the area of the parallelogram from the definition and, at the same time, show that the cross product vector must necessarily be perpendicular. (As a side note, I assume this new vector has to be placed at the point where the crossed vectors are joined, but I cannot see how any proof could demand this; somehow it must be the case by necessity.) Maybe it is too much to ask for a one-stop proof of all the details, so that one does not have to assemble bits and pieces from multiple textbooks.


There are 2 best solutions below

BEST ANSWER

I take the algebraic definition of the cross product to be $$ (a \times b)_i = \sum_{j,k=1}^3 \epsilon_{ijk} a_j b_k, $$ where $\epsilon_{ijk}$ is the Levi-Civita symbol, defined to be $0$ when two of $i,j,k$ are equal, $1$ when $ijk$ is one of $123,231,312$ and $-1$ when $ijk$ is one of $213,132,321$ (the nonzero cases being respectively even and odd permutations of $123$). In particular, $\epsilon_{ijk}$ is totally antisymmetric, in that swapping any two of $i,j,k$ returns the negative of the original value: $$ \epsilon_{ijk} = \epsilon_{jki} = \epsilon_{kij} = -\epsilon_{jik} = -\epsilon_{ikj} = -\epsilon_{kji}. $$ This gives the familiar $(a\times b)_1 = a_2b_3-a_3b_2$ and so on.
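As a sanity check (not part of the original answer), the Levi-Civita definition above can be transcribed directly into code. This sketch uses plain Python with 0-based indices and computes the cross product straight from the triple sum:

```python
def levi_civita(i, j, k):
    """Levi-Civita symbol eps_{ijk} for 0-based indices i, j, k in {0, 1, 2}."""
    if len({i, j, k}) < 3:
        return 0  # two indices equal
    # cyclic (even) permutations of (0, 1, 2) give +1, the other orderings give -1
    return 1 if (i, j, k) in {(0, 1, 2), (1, 2, 0), (2, 0, 1)} else -1

def cross(a, b):
    """(a x b)_i = sum over j, k of eps_{ijk} a_j b_k, straight from the definition."""
    return [sum(levi_civita(i, j, k) * a[j] * b[k]
                for j in range(3) for k in range(3))
            for i in range(3)]
```

For example, `cross([1, 0, 0], [0, 1, 0])` returns `[0, 0, 1]`, i.e. $e_1 \times e_2 = e_3$, and swapping the arguments flips the sign, matching the antisymmetry of $\epsilon_{ijk}$.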

With this definition, it is also clear that $a \times b$ is perpendicular to $a$ and $b$: $$ a \cdot (a\times b) = \sum_{i,j,k=1}^3 \epsilon_{ijk} a_ia_j b_k = 0 $$ since $a_ia_j$ is symmetric and $\epsilon_{ijk}$ antisymmetric when we swap $i$ and $j$.
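The perpendicularity claim is easy to spot-check numerically. Below is a minimal sketch (not from the original answer) using the familiar component formula, which follows from the Levi-Civita definition; the specific vectors are arbitrary illustrative choices:

```python
def cross(a, b):
    # component formula derived from the Levi-Civita definition
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

a, b = [1.0, 2.0, 3.0], [-4.0, 5.0, 0.5]
c = cross(a, b)
# a . (a x b) = 0 and b . (a x b) = 0, as the symmetry argument predicts
assert dot(a, c) == 0.0 and dot(b, c) == 0.0
```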

Lastly, we would like to find the magnitude and direction. We suppose $a$ and $b$ are not parallel (and in particular nonzero), so that $a \times b \neq 0$ and the question is nontrivial. We can actually find both using the same formula: three vectors form a right-handed basis of $\mathbb{R}^3$ if and only if the determinant of the matrix with them as columns is positive: $$ \det \begin{pmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{pmatrix} > 0. $$ But this determinant is $$ \sum_{i,j,k} \epsilon_{ijk} a_i b_j c_k = (a \times b) \cdot c. $$ So if $c = a \times b$, the determinant is $\lVert a \times b \rVert^2 > 0$. So the determinant is positive, and thus $a \times b$ is the third vector of a right-handed basis whose first two elements are $a$ and $b$.
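The identity $\det(a\,|\,b\,|\,c) = (a \times b) \cdot c$, and the resulting positivity when $c = a \times b$, can also be checked numerically. This sketch (my addition, with an arbitrary example pair) hand-codes a $3\times 3$ determinant by cofactor expansion:

```python
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def det3(m):
    # cofactor expansion along the first row
    return (m[0][0] * (m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1] * (m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2] * (m[1][0]*m[2][1] - m[1][1]*m[2][0]))

a, b = [1, 2, 0], [0, 1, 3]
c = cross(a, b)
M = [[a[i], b[i], c[i]] for i in range(3)]  # columns a, b, c
# the determinant equals (a x b) . c = ||a x b||^2, which is positive
assert det3(M) == dot(c, c) and det3(M) > 0
```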

To actually calculate this, we need the following identity: $$ (a \times b) \times c = -a(b \cdot c) + b(a \cdot c). $$ This can be proven using my favourite $\epsilon$ identity: $$ \epsilon_{ijk}\epsilon_{lmn} = \det\begin{pmatrix} \delta_{il} & \delta_{im} & \delta_{in} \\ \delta_{jl} & \delta_{jm} & \delta_{jn} \\ \delta_{kl} & \delta_{km} & \delta_{kn} \end{pmatrix} $$ The proof is rather unenlightening, unfortunately: both sides are zero if two of the indices on one $\epsilon$ are equal (and two rows of the determinant are equal), the symmetries of the left and right-hand sides are the same by the properties of determinants, and then one can just check on one set of indices. But this also means, by expansion, that $$ \sum_k \epsilon_{ijk}\epsilon_{kmn} = \delta_{im} \delta_{jn} - \delta_{in} \delta_{jm}. $$ But then $$ (a \times b) \cdot (a \times b) = \sum_{i,j,k,m,n}\epsilon_{ijk}\epsilon_{kmn} a_ib_j a_m b_n = (a \cdot a)(b \cdot b) - (a\cdot b)^2 = \lVert a \rVert^2\lVert b\rVert^2(1-\cos^2{\theta}) = \lVert a \rVert^2\lVert b\rVert^2\sin^2{\theta}, $$ which is the other formula for the magnitude of the cross product; since the length is nonnegative and $0 \leq \theta \leq \pi$, the sign of the square root is unambiguous.
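The Lagrange identity $\lVert a \times b \rVert^2 = (a \cdot a)(b \cdot b) - (a \cdot b)^2$ derived above, and the equivalent $\lVert a \rVert \lVert b \rVert \sin\theta$ form, can be verified on a concrete pair of vectors. A minimal sketch (my addition, arbitrary example values):

```python
import math

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

a, b = [2.0, 1.0, -1.0], [1.0, 3.0, 2.0]
c = cross(a, b)

# Lagrange identity: ||a x b||^2 = (a.a)(b.b) - (a.b)^2
lhs = dot(c, c)
rhs = dot(a, a) * dot(b, b) - dot(a, b) ** 2
assert math.isclose(lhs, rhs)

# equivalently ||a x b|| = ||a|| ||b|| sin(theta), with 0 <= theta <= pi
cos_t = dot(a, b) / math.sqrt(dot(a, a) * dot(b, b))
area = math.sqrt(dot(a, a) * dot(b, b)) * math.sqrt(1 - cos_t ** 2)
assert math.isclose(math.sqrt(lhs), area)
```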


One quick proof is to use an orthogonal transformation (a rotation) so that we may assume the vectors are $(a,0,0)$ and $(b,c,0)$. Their cross product is then $(0,0,ac)$, which is orthogonal to both, and $|ac|$ is the base times the height of the parallelogram.