Proof of a Vector Identity Using Index Notation


I'm having a hard time proving this vector identity:

$$(A \cdot (B \times C))D = (C \cdot D)(A \times B) + (A \cdot D)(B \times C) + (B \cdot D)(C \times A)$$

Please note: I was hoping to prove this using index notation (Kronecker delta and Levi-Civita epsilon symbols), not by other methods.

Could someone help me with this? I have no idea how to proceed.
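Before attempting a proof, the identity can at least be sanity-checked numerically. Here is a minimal Python sketch in index-notation style, with an explicit Levi-Civita symbol; the helper names and test vectors are my own choices, not part of the question:

```python
# Numerical sanity check of the identity, written in index-notation
# style with an explicit Levi-Civita symbol.

def eps(i, j, k):
    # Levi-Civita symbol on indices 0..2: returns +1, -1, or 0
    return (j - i) * (k - j) * (k - i) // 2

def dot(u, v):
    return sum(u[i] * v[i] for i in range(3))

def cross(u, v):
    # (u x v)_i = eps_{ijk} u_j v_k, summed over j and k
    return [sum(eps(i, j, k) * u[j] * v[k]
                for j in range(3) for k in range(3))
            for i in range(3)]

# Arbitrary integer test vectors (exact arithmetic, no rounding error)
A, B, C, D = [1, 2, 3], [4, -5, 6], [-7, 8, 9], [2, 0, -1]

lhs = [dot(A, cross(B, C)) * d for d in D]
rhs = [dot(C, D) * cross(A, B)[i]
       + dot(A, D) * cross(B, C)[i]
       + dot(B, D) * cross(C, A)[i]
       for i in range(3)]
print(lhs == rhs)  # True
```

This is only a spot check for one set of vectors, of course, not a proof.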

3 Answers

Best Answer

Linearly Independent Case

$1)$ If $A$, $B$, and $C$ are linearly independent vectors, then $A \times B$, $B \times C$, and $C \times A$ are also linearly independent, so they form a basis for $\mathbb{R}^3$ and any vector can be written as a linear combination of them.

$2)$ According to step $(1)$ we have

$$(A \cdot (B \times C))D = \alpha (A \times B) + \beta (B \times C) + \gamma (C \times A)$$

where $\alpha$, $\beta$, and $\gamma$ are unknown coefficients. Taking the dot product with $A$, $B$, and $C$, respectively, we get

$$\eqalign{ & \left( {A \cdot (B \times C)} \right)\left( {A\cdot D} \right) = \beta A\cdot (B \times C) \cr & \left( {A \cdot (B \times C)} \right)\left( {B \cdot D} \right) = \gamma B \cdot (C \times A) \cr & \left( {A \cdot (B \times C)} \right)\left( {C \cdot D} \right) = \alpha C \cdot (A \times B) \cr} $$

$3)$ We note that

$$A \cdot (B \times C) = B \cdot (C \times A) = C \cdot (A \times B)$$

$4)$ We finally conclude that $$\eqalign{ & \alpha = C \cdot D \cr & \beta = A \cdot D \cr & \gamma = B \cdot D \cr} $$
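The conclusion of steps $(2)$–$(4)$ can be spot-checked numerically. A minimal Python sketch for one linearly independent triple (the vector values are my own arbitrary choices):

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

# Arbitrary integer vectors; V != 0 confirms A, B, C are independent
A, B, C, D = [1, 0, 2], [0, 3, 1], [2, 1, 0], [5, -4, 7]
V = dot(A, cross(B, C))   # scalar triple product A . (B x C)
assert V != 0

# Coefficients from step (4)
alpha, beta, gamma = dot(C, D), dot(A, D), dot(B, D)

lhs = [V * d for d in D]
rhs = [alpha * cross(A, B)[i] + beta * cross(B, C)[i]
       + gamma * cross(C, A)[i] for i in range(3)]
print(lhs == rhs)  # True
```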

Linearly Dependent Case

Consider the case when the vectors $A$, $B$, and $C$ are linearly dependent, so they cannot form a basis for $\mathbb{R}^3$. Suppose, say, that $A = aB + bC$ for some real numbers $a$ and $b$. Then $A \cdot (B \times C) = 0$, so the left-hand side vanishes; substituting $A = aB + bC$ into the right-hand side, the terms cancel and it vanishes as well, so the equation reduces to the trivial identity $0 = 0$.

Answer

H.R. has given the answer for the case when $A,B,C$ are linearly independent. In $\mathbb{R}^3$ if $A,B,C$ are linearly dependent, then:

$(1).$ If $B,C$ are linearly dependent then $B \times C = 0$ and one of $B = \lambda C$ or $C = \lambda B$ holds. In either case the two remaining terms on the right cancel; e.g. with $B = \lambda C$, $$(C \cdot D)(A \times B) + (B \cdot D)(C \times A) = -\lambda (C \cdot D)(C \times A) + \lambda (C \cdot D)(C \times A) = 0,$$ so both sides are zero.

$(2).$ If $B,C$ are linearly independent, then $A$ lies in their span; choose $A'$ outside the span of $B,C$ and set $A'' = A + A'$, so $A''$ is also outside that span. Then $A'',B,C$ and $A',B,C$ are both linearly independent triples, so the identity holds for each of them; since both sides are linear in the first vector, subtracting the two equations gives the identity for $A = A'' - A'$.

Answer

Since there is a request to show this using index notation (i.e. in terms of coordinates), observe that if the equation holds for each of $A_1, A_2,$ and $A_3$, then it holds for any linear combination of $A_1, A_2, A_3$; similar observations apply to $B$, $C$, and $D$. Since $(u,v,w) \cdot (u',v',w') = uu' + vv' + ww'$ and $(u,v,w) \times (u',v',w') = (vw' - wv',\ wu' - uw',\ uv' - vu')$, writing out the whole equation would give a huge formula that would eventually reduce to $0=0$ if you are a computer. But using the observation, it suffices to verify the equation when $$\{A,B,C,D\} \subset \{(1,0,0),(0,1,0),(0,0,1)\}.$$

This gives $81$ cases to check, but we can reduce much further by observing that the equation holds for $A,B,C,D$ iff it holds for $$f(A),f(B),f(C),f(D)$$ $$\text{and for } f(f(A)),f(f(B)),f(f(C)),f(f(D)),$$ $$\text{where } f(x,y,z)=(y,z,x),$$ because $f$ just permutes the coordinates, on both sides of the equation, in the same way. So we need only consider $A=(1,0,0)$, which leaves $27$ cases.

Further, if $(A=B) \lor (B=C) \lor (C=A)$ then the original equation is easily verified, because of the properties $U \times V = -V \times U$ and $U \times U = 0$ for any vectors $U,V$. Still further, these properties show that the original equation holds iff it holds when we interchange $B$ with $C$. So it suffices to consider $$A=(1,0,0),\quad B=(0,1,0),\quad C=(0,0,1).$$

At this point we have $$(A \times B = C) \land (B \times C = A) \land (C \times A = B)$$ $$\text{and } (A \cdot (B \times C))D = (A \cdot A)D = D.$$ Now we can handle all $D$ at once by confirming that $$D = (C \cdot D)C + (A \cdot D)A + (B \cdot D)B$$ $$\text{when } A=(1,0,0),\ B=(0,1,0),\ C=(0,0,1),$$ which should be obvious.
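The brute-force part of this answer can also be delegated to a few lines of Python. A sketch (helper names are mine) that simply checks all $81$ basis cases directly:

```python
from itertools import product

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def holds(A, B, C, D):
    # Check (A.(BxC)) D == (C.D)(AxB) + (A.D)(BxC) + (B.D)(CxA)
    s = dot(A, cross(B, C))
    lhs = [s * d for d in D]
    rhs = [dot(C, D) * cross(A, B)[i] + dot(A, D) * cross(B, C)[i]
           + dot(B, D) * cross(C, A)[i] for i in range(3)]
    return lhs == rhs

basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
cases = list(product(basis, repeat=4))   # all 3^4 = 81 choices
print(len(cases), all(holds(*q) for q in cases))  # 81 True
```

By the multilinearity observation at the start of the answer, these $81$ cases suffice to establish the identity for all vectors.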