How can $(A \times \nabla) \times B$ be rearranged?


Given three vectors $A$, $B$, and $C$, I am able to derive that $A \times (B \times C)$ is equal to $(A \cdot C)B-(A \cdot B)C$.
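As a quick sanity check (my own sketch, not part of the question), the BAC-CAB identity can be verified numerically on arbitrary sample vectors:

```python
# Numeric spot-check of A x (B x C) = (A.C)B - (A.B)C,
# using plain Python lists as 3-vectors (sample values chosen arbitrarily).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cross(u, v):
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

A, B, C = [1.0, 2.0, 3.0], [4.0, -1.0, 0.5], [-2.0, 0.0, 1.0]

lhs = cross(A, cross(B, C))                                   # A x (B x C)
rhs = [dot(A, C) * b - dot(A, B) * c for b, c in zip(B, C)]   # (A.C)B - (A.B)C

assert all(abs(l - r) < 1e-12 for l, r in zip(lhs, rhs))
```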

However, I do not think I can apply this identity to $(A \times \nabla) \times B$, because I would get: $$(A \times \nabla) \times B = (A \cdot B)\nabla - (A \cdot \nabla)B,$$ which is clearly wrong, since $(A \cdot B)\nabla$ is not a vector but a differential operator.

How would I go about this then?


Working with the Levi-Civita symbol yields the components

$$[(A \times \nabla) \times B]_i=\epsilon_{ijk}\epsilon_{jlm}A_l\partial_mB_k=A_k \partial_i B_k - A_i\partial_kB_k,$$

which doesn't seem to be particularly useful in this case.
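The epsilon-delta contraction behind this step, $\epsilon_{ijk}\epsilon_{jlm} = \delta_{kl}\delta_{im} - \delta_{km}\delta_{il}$, can be brute-force checked; a minimal Python sketch (the `eps` helper is an illustrative implementation, not from the post):

```python
# Brute-force check of the contraction  sum_j eps_{ijk} eps_{jlm}
#   = d_{kl} d_{im} - d_{km} d_{il},
# which gives A_k d_i B_k - A_i d_k B_k once contracted with A_l d_m B_k.

def eps(i, j, k):
    # Levi-Civita symbol: sign of the permutation (i, j, k) of (0, 1, 2), else 0.
    return (j - i) * (k - i) * (k - j) // 2 if {i, j, k} == {0, 1, 2} else 0

def delta(a, b):
    return 1 if a == b else 0

for i in range(3):
    for k in range(3):
        for l in range(3):
            for m in range(3):
                contraction = sum(eps(i, j, k) * eps(j, l, m) for j in range(3))
                assert contraction == delta(k, l) * delta(i, m) - delta(k, m) * delta(i, l)
```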

Best answer:

Here's an approach that I like. First, note that we can write $$ \nabla \times F = \sum_{i=1}^3 e_i \times \frac{\partial F}{\partial x_i}, $$ where $e_1,e_2,e_3$ denote $\hat i, \hat j, \hat k$, and $$ \frac{\partial F}{\partial x_i} = \left(\frac{\partial F_1}{\partial x_i},\frac{\partial F_2}{\partial x_i}, \frac{\partial F_3}{\partial x_i}\right). $$ With that in mind (applying the triple-product identity $(X \times Y) \times Z = (X \cdot Z)Y - (Y \cdot Z)X$ in the second step), we can write $$ \begin{align*} (A \times \nabla) \times B &= \sum_{i=1}^3 (A \times e_i) \times \frac{\partial B}{\partial x_i} \\ & = \sum_{i=1}^3 \left[\left(A \cdot \frac{\partial B}{\partial x_i}\right) e_i - \left (\frac{\partial B}{\partial x_i} \cdot e_i \right)A\right] \\ & = \sum_{i=1}^3 \left(A \cdot \frac{\partial B}{\partial x_i}\right) e_i - A\sum_{i=1}^3\frac{\partial B}{\partial x_i} \cdot e_i \\ & = \sum_{i=1}^3 \left(A \cdot \frac{\partial B}{\partial x_i}\right) e_i - A(\nabla \cdot B) \end{align*} $$ As the other answer shows, the first term can be rewritten as $A \cdot (\nabla B)$, where $\nabla B$ is the Jacobian matrix of $B$.
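This derivation lends itself to a numerical sanity check (my own sketch, not from the answer): build the sum $\sum_i (A\times e_i)\times \partial B/\partial x_i$ term by term, with finite-difference partials of an arbitrary smooth test field, and compare it with the final closed form.

```python
# Finite-difference check of
#   (A x nabla) x B = sum_i (A x e_i) x dB/dx_i
#                   = sum_i (A . dB/dx_i) e_i - A (div B).
# The field B below is an arbitrary smooth test field, not from the post.

def cross(u, v):
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def B(p):
    x, y, z = p
    return [x * x * y, y * z, x + z * z]

def dB(p, i, h=1e-5):
    # Central-difference approximation of dB/dx_i at the point p.
    plus, minus = list(p), list(p)
    plus[i] += h
    minus[i] -= h
    return [(a - b) / (2 * h) for a, b in zip(B(plus), B(minus))]

e = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
A = [1.0, -2.0, 0.5]
p = [0.3, -0.7, 1.1]

parts = [dB(p, i) for i in range(3)]

# Left side: sum_i (A x e_i) x dB/dx_i, term by term.
lhs = [0.0, 0.0, 0.0]
for i in range(3):
    term = cross(cross(A, e[i]), parts[i])
    lhs = [a + b for a, b in zip(lhs, term)]

# Right side: sum_i (A . dB/dx_i) e_i - A (div B).
div_B = sum(parts[i][i] for i in range(3))
rhs = [dot(A, parts[m]) - A[m] * div_B for m in range(3)]

assert all(abs(a - b) < 1e-7 for a, b in zip(lhs, rhs))
```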

Another answer:

We have (using Einstein summation notation): \begin{align*} (\mathbf{A}\times\nabla)_i&=\varepsilon_{ijk}A_j\partial_k \\ [(\mathbf{A}\times\nabla)\times \mathbf{B}]_{m}&=\varepsilon_{mil}\underbrace{\varepsilon_{ijk}A_j\partial_k}_{(\mathbf{A}\times\nabla)_i}B_l \\ &=-\varepsilon_{iml}\varepsilon_{ijk}A_j\partial_kB_l \\ &=-(\delta_{mj}\delta_{lk}-\delta_{mk}\delta_{lj})A_j\partial_kB_l \\ &=(\delta_{mk}\delta_{lj}-\delta_{mj}\delta_{lk})A_j\partial_kB_l \\ &=\delta_{mk}\delta_{lj}A_j\partial_kB_l-\delta_{mj}\delta_{lk}A_j\partial_kB_l \\ &=\delta_{mk}A_j\partial_kB_j-\delta_{mj}A_j\partial_kB_k \\ &=A_j\partial_mB_j-A_m\partial_kB_k, \end{align*} as you obtained (though with different dummy indices, an inconsequential difference). The final step is to recognize this in terms of vectors and the products available to us (dot product, scalar multiplication, cross product). We have $$(\mathbf{A}\times\nabla)\times \mathbf{B}=\mathbf{A}\cdot(\nabla\mathbf{B})-\mathbf{A}(\nabla\cdot\mathbf{B}). $$ Now, the notation $\nabla\mathbf{B}$ requires some explanation. It's the Jacobian matrix $$\nabla\mathbf{B}=\left(\frac{\partial B_i}{\partial x_j}\right)_{ij}, $$ a second-rank tensor, so that $[\mathbf{A}\cdot(\nabla\mathbf{B})]_m=A_j\partial_mB_j$ and the dot product $\mathbf{A}\cdot(\nabla\mathbf{B})$ is still a vector, as we need (the result should definitely be a vector!).
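To illustrate the notation (a sketch with made-up numbers, not from the answer): with an explicit matrix standing in for the Jacobian $\nabla\mathbf{B}$, the double-epsilon expression and the vector form $\mathbf{A}\cdot(\nabla\mathbf{B})-\mathbf{A}(\nabla\cdot\mathbf{B})$ agree component by component.

```python
# Check the component result A_j d_m B_j - A_m d_k B_k against the
# vector form A.(grad B) - A (div B), with an arbitrary matrix J
# standing in for the Jacobian (entries are made up; J[l][k] plays
# the role of d_k B_l).

def eps(i, j, k):
    # Levi-Civita symbol for indices in {0, 1, 2}.
    return (j - i) * (k - i) * (k - j) // 2 if {i, j, k} == {0, 1, 2} else 0

A = [2.0, -1.0, 3.0]
J = [[1.0, 2.0, 3.0],    # J[l][k] = d_k B_l
     [0.0, -1.0, 4.0],
     [2.0, 0.5, 1.0]]

# Index definition: [(A x nabla) x B]_m = eps_{mil} eps_{ijk} A_j d_k B_l.
lhs = [sum(eps(m, i, l) * eps(i, j, k) * A[j] * J[l][k]
           for i in range(3) for j in range(3)
           for k in range(3) for l in range(3))
       for m in range(3)]

# Vector form: [A.(grad B)]_m = sum_j A_j J[j][m] (A contracts the first
# index of the Jacobian), minus A_m times the trace (the divergence).
trace_J = sum(J[k][k] for k in range(3))
rhs = [sum(A[j] * J[j][m] for j in range(3)) - A[m] * trace_J for m in range(3)]

assert all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs))
```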