Proof of Vector Identity using Summation Notation


I am working through the following problem:

Prove: $ (A \cdot B)^{2} + (A \times B) \cdot (A \times B) = |A|^{2}|B|^{2} $ using summation notation.
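(As an aside, the identity itself is Lagrange's identity in three dimensions, and it is easy to sanity-check numerically before attempting the index proof. The following NumPy sketch is purely illustrative and is of course not a substitute for the summation-notation argument:)

```python
import numpy as np

# Numerical spot-check of (A.B)^2 + (A x B).(A x B) = |A|^2 |B|^2
rng = np.random.default_rng(0)
A = rng.standard_normal(3)
B = rng.standard_normal(3)

lhs = np.dot(A, B) ** 2 + np.dot(np.cross(A, B), np.cross(A, B))
rhs = np.dot(A, A) * np.dot(B, B)
assert np.isclose(lhs, rhs)
```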

I just wanted to check my understanding on the following parts of the proof:


Question 1:

We can write $ A \cdot B = a_i \hat e_i \cdot b_j \hat e_j = a_ib_i\epsilon_{ijk}$ (1)

Then applying this to $ (A \times B) \cdot (A \times B) $ we get:

$ (A \times B) \cdot (A \times B) = a_ib_j\epsilon_{ijk} \hat e_k \cdot a_ib_j\epsilon_{ijk} \hat e_k $

In my book however, it instead chooses to make the following notational choice:

$ (A \times B) \cdot (A \times B) = a_ib_j\epsilon_{ijk} \hat e_k \cdot a_mb_n\epsilon_{mnp} \hat e_p $

Why does it do that?


Question 2:

Now running with what the book has:

$ (A \times B) \cdot (A \times B) = a_ib_j\epsilon_{ijk} \hat e_k \cdot a_mb_n\epsilon_{mnp} \hat e_p = a_ia_mb_jb_n\epsilon_{ijk}\epsilon_{mnp} (\hat e_k \cdot \hat e_p) = a_ia_mb_jb_n\epsilon_{ijk}\epsilon_{mnp} $

Does the last step follow from the fact that since $k = p$ therefore $ \hat e_k \cdot \hat e_p = \delta_{kp} = 1?$

Any guidance is much appreciated.

edit:

The following typo was fixed in (1):

$ A \cdot B = a_i \hat e_i \times b_j \hat e_j = a_ib_i\epsilon_{ijk}$

became:

$ A \cdot B = a_i \hat e_i \cdot b_j \hat e_j = a_ib_i\epsilon_{ijk}$


There are 2 answers below.

  1. In summation convention, you can't have more than two of a particular letter: one is a free index, two tells you to sum. But you can't have three. So when you multiply two of the same expression together, you have to change the indices on one. Since indices that are summed over are bound variables, the actual letter used is otherwise unimportant, i.e. $a_i b_i = a_j b_j$ and so on.

(If you're wondering why you can't have more than two, you can think of it as a much more general version of matrix multiplication, where $ [AB]_{ij} = \sum_{k} [A]_{ik} [B]_{kj} $, so multiplying by a term with the same index "uses it up". All such summation-convention index manipulations can be thought of in this way, but we don't normally do so, to save time.)
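(The relabelling of dummy indices, and the matrix-multiplication analogy, can both be illustrated with NumPy's `einsum`, which implements exactly this summation convention. A small sketch, purely for illustration:)

```python
import numpy as np

# The label on a summed (dummy) index is immaterial: a_i b_i == a_j b_j.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
s_i = np.einsum('i,i->', a, b)   # sum over i
s_j = np.einsum('j,j->', a, b)   # same sum, relabelled to j
assert s_i == s_j == np.dot(a, b)

# The matrix-multiplication analogy: [MN]_{ij} = M_{ik} N_{kj},
# where the repeated index k is "used up" by the contraction.
M = np.arange(4.0).reshape(2, 2)
N = np.arange(4.0, 8.0).reshape(2, 2)
assert np.allclose(np.einsum('ik,kj->ij', M, N), M @ N)
```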

(You also need $$ A \times B = (a_i \hat{e}_i) \times (b_j \hat{e}_j) = a_i b_j (\hat{e}_i \times \hat{e}_j) = a_i b_j \epsilon_{ijk} \hat{e}_k , $$ rather than "$\cdot$", as you wrote in the first line, but presumably that was just a typo.)

  2. You've got this a bit backwards. You've reached $$ a_i b_j \epsilon_{ijk} a_m b_n \epsilon_{mnp} (\hat{e}_k \cdot \hat{e}_p). $$ Now $ \hat{e}_k \cdot \hat{e}_p = \delta_{kp} $ (because the $\hat{e}_i$ form an orthonormal basis: that is exactly what orthonormality means), so you then have $$ a_i b_j \epsilon_{ijk} a_m b_n \epsilon_{mnp} \delta_{kp} = a_i b_j \epsilon_{ijk} a_m b_n \epsilon_{mnk} $$ because the $\delta$ sets the two indices equal (rather than the implication running the other way, as you have it).
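(The $\delta_{kp}$ contraction above can be checked numerically by building the Levi-Civita symbol as a 3×3×3 array. The sketch below, purely illustrative, also verifies the standard epsilon-delta identity $\epsilon_{ijk}\epsilon_{mnk} = \delta_{im}\delta_{jn} - \delta_{in}\delta_{jm}$, which is the usual next step in finishing the proof:)

```python
import numpy as np

# Levi-Civita symbol eps[i,j,k] in 3D
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0
delta = np.eye(3)

# Contracting with delta_{kp} just renames p to k:
lhs = np.einsum('ijk,mnp,kp->ijmn', eps, eps, delta)
rhs = np.einsum('ijk,mnk->ijmn', eps, eps)
assert np.allclose(lhs, rhs)

# Standard epsilon-delta identity used to finish the proof:
# eps_{ijk} eps_{mnk} = d_{im} d_{jn} - d_{in} d_{jm}
iden = (np.einsum('im,jn->ijmn', delta, delta)
        - np.einsum('in,jm->ijmn', delta, delta))
assert np.allclose(rhs, iden)
```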

Question 1:

The convention is as follows: a single index variable appearing twice in the same term implies a sum. The key word there is "twice"; that is, the meaning of $a_ib_i$ is

$$ a_ib_i = a_1b_1 + a_2b_2 + \cdots = a_j b_j = a_k b_k $$

but the term $a_ib_ic_i$ has no meaning under Einstein's notation. That's why you should avoid using the same label more than twice in a single term.

Going back to your book, this implies

$$ ({\bf A} \times {\bf B})\cdot ({\bf A} \times {\bf B}) = (a_i b_j \epsilon_{ijk} \hat{\bf e}_k) \cdot (a_m b_n \epsilon_{mnp} \hat{\bf e}_p) = \cdots $$
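(With concrete vectors, the book's double-sum expression can be evaluated directly with NumPy's `einsum` and compared against `np.cross`; a purely illustrative sketch:)

```python
import numpy as np

# Levi-Civita symbol eps[i,j,k] in 3D
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

a = np.array([1.0, 2.0, 3.0])
b = np.array([-1.0, 0.5, 2.0])

# a_i b_j eps_{ijk} a_m b_n eps_{mnk}  (after e_k . e_p = delta_{kp})
val = np.einsum('i,j,ijk,m,n,mnk->', a, b, eps, a, b, eps)
c = np.cross(a, b)
assert np.isclose(val, np.dot(c, c))
```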

Question 2

To keep it simple, consider this:

$$ {\bf A}\cdot {\bf B} = (a_i \hat{\bf e}_i) \cdot (b_j \hat{\bf e}_j) = a_i b_j (\hat{\bf e}_i \cdot \hat{\bf e}_j) \tag{1} $$

Once you expand Eq. (1) using Einstein's notation you will see terms of the form $(\hat{\bf e}_1 \cdot \hat{\bf e}_1) = 1$, $(\hat{\bf e}_1 \cdot \hat{\bf e}_2) = 0$, $(\hat{\bf e}_1 \cdot \hat{\bf e}_3) = 0$, $\cdots$, $(\hat{\bf e}_2 \cdot \hat{\bf e}_2) = 1$, $\cdots$. It is easy to see the pattern here: only the terms in which the two indices are equal survive; the rest vanish. In other words, $$ \hat{\bf e}_i \cdot \hat{\bf e}_j = \delta_{ij} $$

So Eq (1) becomes

$$ {\bf A}\cdot {\bf B} = (a_i \hat{\bf e}_i) \cdot (b_j \hat{\bf e}_j) = a_i b_j (\hat{\bf e}_i \cdot \hat{\bf e}_j) = a_i b_i \tag{2} $$
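(The collapse of the double sum $a_i b_j (\hat{\bf e}_i \cdot \hat{\bf e}_j)$ down to $a_i b_i$ can likewise be checked with the standard basis, taking the rows of the identity matrix as the $\hat{\bf e}_i$; a purely illustrative sketch:)

```python
import numpy as np

e = np.eye(3)  # rows are the orthonormal basis vectors e_1, e_2, e_3

# e_i . e_j = delta_{ij}
gram = np.einsum('ik,jk->ij', e, e)
assert np.allclose(gram, np.eye(3))

# The double sum a_i b_j (e_i . e_j) collapses to a_i b_i = A.B
a = np.array([1.0, -2.0, 0.5])
b = np.array([3.0, 1.0, 2.0])
full = np.einsum('i,j,ij->', a, b, gram)
assert np.isclose(full, np.dot(a, b))
```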