How to differentiate the following interesting vector product?


How do we differentiate the following vector product with respect to $\boldsymbol r$? \begin{equation} \frac{d}{d\boldsymbol r}\bigg[(\boldsymbol \omega \times\boldsymbol r)\cdot (\boldsymbol \omega \times\boldsymbol r)\bigg] \end{equation} I know that the answer is $-\boldsymbol \omega\times(\boldsymbol\omega\times\boldsymbol r)$. Here is my "not so great" working, which led me to something resembling the Lagrange identity. \begin{equation} \frac{d}{d\boldsymbol r}\bigg[(\boldsymbol \omega\times \boldsymbol r)\cdot (\boldsymbol \omega\times\boldsymbol r)\bigg]=\frac{d}{d\boldsymbol r}(\boldsymbol\omega \times\boldsymbol r)\cdot (\boldsymbol \omega\times\boldsymbol r)+(\boldsymbol \omega \times\boldsymbol r)\cdot \frac{d}{d\boldsymbol r}(\boldsymbol \omega\times\boldsymbol r) \end{equation} \begin{equation} =\bigg[\frac{d}{d\boldsymbol r}\boldsymbol\omega\times\boldsymbol r\bigg]\cdot \bigg[\boldsymbol\omega\times\boldsymbol r\bigg]+\bigg[\boldsymbol\omega\times\boldsymbol r\bigg]\cdot\bigg[\frac{d}{d\boldsymbol r}\boldsymbol\omega\times\boldsymbol r\bigg] \end{equation} \begin{equation} =2\bigg\{\bigg[\frac{d}{d\boldsymbol r}\boldsymbol\omega\cdot\boldsymbol\omega\bigg]\bigg[\boldsymbol r\cdot \boldsymbol r\bigg]-\bigg[\frac{d}{d\boldsymbol r}\boldsymbol\omega\cdot \boldsymbol r\bigg]\bigg[\boldsymbol r\cdot \boldsymbol \omega\bigg]\bigg\} \end{equation} Thank you,
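A quick numerical sanity check can also settle the overall constant. Here is a minimal sketch (assuming numpy is available; the vectors and step size are arbitrary choices) that estimates the gradient by central differences and compares it with the closed form derived in the answers below:

```python
# Sketch: estimate the gradient of f(r) = (w x r) . (w x r) by central
# differences and compare it with -2 w x (w x r).
import numpy as np

w = np.array([1.0, -2.0, 0.5])
r = np.array([0.3, 1.2, -0.7])

def f(r):
    c = np.cross(w, r)
    return np.dot(c, c)

h = 1e-6
grad = np.array([(f(r + h * e) - f(r - h * e)) / (2 * h) for e in np.eye(3)])

print(grad)                              # numerical gradient
print(-2 * np.cross(w, np.cross(w, r)))  # closed form; agrees up to floating-point error
```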


3 Answers

Best Answer

If you know what the Kronecker delta $\delta_{ij}$ and the Levi-Civita symbol $\epsilon_{ijk}$ are, there's an even simpler method than the componentwise one presented by Kelechi Nze in the other answer.

If this is your first time with these symbols, what I'm going to show you below may look complicated, but believe me: it's a simple and very powerful method, well worth learning. It will help you prove complicated vector identities in no time.

The derivation below looks long only because I'm being quite detailed. Once you get the hang of this method, you can do these things in just a couple of lines.

As it happens, the scalar product of two vectors $A$ and $B$ can be written as

$$A \cdot B = \sum_{i\,=\,1}^{3} A_{i}\,B_{i} = \sum_{i\,=\,1}^{3} \sum_{j\,=\,1}^{3} \delta_{ij}\,A_{i}\,B_{j} $$

but we'll write it as

$$A \cdot B = \delta_{ij}\,A_{i}\,B_{j} $$

where a sum over repeated indices on the same side of an equation is implied (the so-called Einstein summation convention). The indices represent components.
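If it helps to see the convention in action, here is a minimal sketch (assuming numpy is available; $A$ and $B$ are arbitrary) showing that $\delta_{ij}\,A_{i}\,B_{j}$ reproduces the ordinary dot product:

```python
# Sketch: the contraction delta_ij A_i B_j equals A . B.
import numpy as np

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
delta = np.eye(3)  # Kronecker delta as the 3x3 identity matrix

print(np.einsum('ij,i,j->', delta, A, B))  # 32.0
print(np.dot(A, B))                        # 32.0
```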

Likewise, the vector product $A \times B$, in 3 dimensions, can be written as

$$(A \times B)_{i} = \epsilon_{ijk}\,A_{j}\,B_{k}$$

again with a sum over repeated indices implied. Note that $i$ is not repeated on the LHS nor is it repeated on the RHS, so there's no sum over $i$ implied. (L/RHS = left/right-hand side)
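As a concrete sketch of this formula (again assuming numpy; the vectors are arbitrary), one can build $\epsilon_{ijk}$ as a $3\times3\times3$ array and check it against the built-in cross product:

```python
# Sketch: (A x B)_i = eps_ijk A_j B_k, with eps built explicitly.
import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0   # even permutations
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0  # odd permutations

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

print(np.einsum('ijk,j,k->i', eps, A, B))  # [-3.  6. -3.]
print(np.cross(A, B))                      # [-3.  6. -3.]
```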

Using the above, the $m$-th component of the expression you want to compute can be written as

$$\frac{d}{dr_{m}}\Big[\, (\epsilon_{ijk}\,\omega_{j}\,r_{k})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) \,\Big]$$

Note that $i,j,k,r,s$ are all repeated, but there is a free index $m$ left over. (Careful: $r$ and $s$ here are dummy indices; the index $r$ is unrelated to the vector whose components are $r_{k}$.) So,

$$\frac{d}{dr_{m}}\Big[\, (\epsilon_{ijk}\,\omega_{j}\,r_{k})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) \,\Big] = (\epsilon_{ijk}\,\omega_{j}\,\frac{dr_{k}}{dr_{m}})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) + (\epsilon_{ijk}\,\omega_{j}\,r_{k})\, (\epsilon_{irs}\,\omega_{r}\frac{dr_{s}}{dr_{m}}) $$

But $\frac{dr_{k}}{dr_{m}} = \delta_{km}$ and $\frac{dr_{s}}{dr_{m}} = \delta_{sm}$ (this is just the index form of the familiar facts $dx/dx = 1$ and $dx/dy = dx/dz = 0$ for independent variables), so

$$\frac{d}{dr_{m}}\Big[\, (\epsilon_{ijk}\,\omega_{j}\,r_{k})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) \,\Big] = (\epsilon_{ijk}\,\omega_{j}\,\delta_{km})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) + (\epsilon_{ijk}\,\omega_{j}\,r_{k})\, (\epsilon_{irs}\,\omega_{r}\,\delta_{sm}) $$

Since a sum over $k$ and a sum over $s$ are implied and $\delta_{km}$ is 0 when $k \ne m$ but 1 when $k = m$ (likewise for $s$), we have

$$\frac{d}{dr_{m}}\Big[\, (\epsilon_{ijk}\,\omega_{j}\,r_{k})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) \,\Big] = (\epsilon_{ijm}\,\omega_{j})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) + (\epsilon_{ijk}\,\omega_{j}\,r_{k})\, (\epsilon_{irm}\,\omega_{r}) $$

Now, repeated indices are dummy indices, so we can rename them at will (carefully, of course). To make the second term on the right look like the first, exchange $k$ with $s$ and $r$ with $j$ in the second term:

$$\frac{d}{dr_{m}}\Big[\, (\epsilon_{ijk}\,\omega_{j}\,r_{k})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) \,\Big] = (\epsilon_{ijm}\,\omega_{j})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) + (\epsilon_{irs}\,\omega_{r}\,r_{s})\, (\epsilon_{ijm}\,\omega_{j}) $$

We now see that they're identical, so

$$\frac{d}{dr_{m}}\Big[\, (\epsilon_{ijk}\,\omega_{j}\,r_{k})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) \,\Big] = 2\,(\epsilon_{ijm}\,\omega_{j})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) $$

Now we convert the result back to the "normal" vector notation:

$$\frac{d}{dr_{m}}\Big[\, (\epsilon_{ijk}\,\omega_{j}\,r_{k})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) \,\Big] = 2\,(\epsilon_{ijm}\,\omega_{j})\, (\omega \times r)_{i} $$

It turns out you can cyclically rotate the indices of $\epsilon$ without changing its sign, so $\epsilon_{ijm} = \epsilon_{mij}$ and

$$\frac{d}{dr_{m}}\Big[\, (\epsilon_{ijk}\,\omega_{j}\,r_{k})\, (\epsilon_{irs}\,\omega_{r}\,r_{s}) \,\Big] = 2\,(\epsilon_{mij}\,\omega_{j})\, (\omega \times r)_{i} = 2\,\epsilon_{mij}\,(\omega \times r)_{i}\,\omega_{j} = 2\,((\omega \times r) \times \omega)_{m} $$

Thus,

$$\frac{d}{dr}\Big[\, (\omega \times r) \cdot (\omega \times r) \,\Big] = 2\,((\omega \times r) \times \omega) $$

Now, as we all know, the vector product is anti-commutative, so $A \times B = - B \times A$. Hence,

$$\frac{d}{dr}\Big[\, (\omega \times r) \cdot (\omega \times r) \,\Big] = -2\,\omega \times (\omega \times r) $$

(Alternatively, we could have used the fact that $\epsilon_{mij} = -\epsilon_{mji}$ in a previous step.)
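For anyone who wants to double-check the index gymnastics numerically, here is a minimal sketch (numpy assumed; $\omega$ and $r$ arbitrary) that evaluates $2\,\epsilon_{mij}\,(\omega \times r)_{i}\,\omega_{j}$ directly and compares it with $-2\,\omega \times (\omega \times r)$:

```python
# Sketch: evaluate 2 eps_mij (w x r)_i w_j with einsum and compare it with
# the closed form -2 w x (w x r).
import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

w = np.array([1.0, -2.0, 0.5])
r = np.array([0.3, 1.2, -0.7])
wxr = np.cross(w, r)

print(2 * np.einsum('mij,i,j->m', eps, wxr, w))  # 2 ((w x r) x w)_m
print(-2 * np.cross(w, np.cross(w, r)))          # same vector
```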

Like I said, this looks complicated at first but really isn't and is a very powerful method.

One last thing: sometimes you'll need the following result, which is a pain to prove on its own, and worth memorising:

$$\epsilon_{ijk}\epsilon_{irs} = \delta_{jr}\,\delta_{ks} - \delta_{js}\,\delta_{kr}$$

There's a sum only over $i$, and it's the first index in both $\epsilon$ symbols.
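If you'd rather not memorise it blindly, a brute-force check over all index values is easy; here is a sketch (numpy assumed):

```python
# Sketch: verify eps_ijk eps_irs = delta_jr delta_ks - delta_js delta_kr
# by comparing both sides as 4-index arrays.
import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0
delta = np.eye(3)

lhs = np.einsum('ijk,irs->jkrs', eps, eps)  # sum over the shared first index i
rhs = np.einsum('jr,ks->jkrs', delta, delta) - np.einsum('js,kr->jkrs', delta, delta)

print(np.allclose(lhs, rhs))  # True
```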

Answer (by Kelechi Nze)

Here's an (ugly) method:

First write (using $\boldsymbol{w}$ for your $\boldsymbol{\omega}$) $$\boldsymbol{w}=\begin{pmatrix}w_1\\w_2\\w_3 \end{pmatrix}, \boldsymbol{r} = \begin{pmatrix}r_1\\r_2\\r_3 \end{pmatrix}$$

Then compute \begin{equation} (\boldsymbol w \times\boldsymbol r)\cdot (\boldsymbol w \times\boldsymbol r) = w_2^2r_3^2 + w_3^2r_2^2 - 2w_2r_3w_3r_2 \\ + w_1^2r_3^2 + w_3^2r_1^2 - 2w_1r_3w_3r_1 \\ + w_1^2r_2^2 + w_2^2r_1^2 - 2w_1r_2w_2r_1 \end{equation}

Assuming that $\boldsymbol{w}$ is independent of $\boldsymbol{r}$, we can now calculate $\frac{d}{d\boldsymbol{r}}\bigg[(\boldsymbol w \times\boldsymbol r)\cdot (\boldsymbol w \times\boldsymbol r)\bigg]$ with ease.

$$\frac{d}{d\boldsymbol{r}}\bigg[(\boldsymbol w \times\boldsymbol r)\cdot (\boldsymbol w \times\boldsymbol r)\bigg] = \frac{d}{d\boldsymbol{r}} \bigg[w_2^2r_3^2 + w_3^2r_2^2 - 2w_2r_3w_3r_2 + w_1^2r_3^2 + w_3^2r_1^2 - 2w_1r_3w_3r_1 + w_1^2r_2^2 + w_2^2r_1^2 - 2w_1r_2w_2r_1\bigg] $$

As $\boldsymbol w$ is independent of $\boldsymbol r$, all $\frac{\partial w_k}{\partial r_j}$ terms are zero, and as each $r_k$ is independent of every other $r_j$, we have $\frac{\partial r_k}{\partial r_j} = 0$ for $j \neq k$ and $\frac{\partial r_k}{\partial r_k} = 1$. So we obtain $$\frac{d}{d\boldsymbol{r}} \bigg[w_2^2r_3^2 + w_3^2r_2^2 - 2w_2r_3w_3r_2 + w_1^2r_3^2 + w_3^2r_1^2 - 2w_1r_3w_3r_1 + w_1^2r_2^2 + w_2^2r_1^2 - 2w_1r_2w_2r_1\bigg] = \pmatrix{2(w_2^2 + w_3^2)r_1 - 2(w_2r_2 + w_3r_3)w_1 \\ 2(w_1^2 + w_3^2)r_2 - 2(w_1r_1+w_3r_3)w_2 \\ 2(w_1^2+w_2^2)r_3 - 2(w_1r_1 + w_2r_2)w_3} = -2 \boldsymbol{w}\times(\boldsymbol{w}\times\boldsymbol{r}) $$

This agrees with your stated answer, apart from a factor of 2.
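As a sanity check on the componentwise algebra above, here is a small sketch (numpy assumed; the numerical values of $\boldsymbol w$ and $\boldsymbol r$ are arbitrary):

```python
# Sketch: compare the componentwise gradient with -2 w x (w x r).
import numpy as np

w1, w2, w3 = 1.0, -2.0, 0.5
r1, r2, r3 = 0.3, 1.2, -0.7
w, r = np.array([w1, w2, w3]), np.array([r1, r2, r3])

grad = np.array([
    2 * (w2**2 + w3**2) * r1 - 2 * (w2 * r2 + w3 * r3) * w1,
    2 * (w1**2 + w3**2) * r2 - 2 * (w1 * r1 + w3 * r3) * w2,
    2 * (w1**2 + w2**2) * r3 - 2 * (w1 * r1 + w2 * r2) * w3,
])

print(grad)
print(-2 * np.cross(w, np.cross(w, r)))  # agrees componentwise
```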

Answer

@Kelechi Nze Can you please give a more detailed demonstration of what you did there? I really don't understand how you arrived at the matrix at the end of the demonstration.