Vector analysis. Del and dot products


I am trying to prove that $$\nabla(\mathbf{A} \cdot \mathbf{B}) = (\mathbf{A} \cdot \nabla)\mathbf{B} + (\mathbf{B} \cdot \nabla)\mathbf{A} + \mathbf{A} \times (\nabla \times \mathbf{B}) + \mathbf{B} \times (\nabla \times \mathbf{A})$$

I've gotten as far as $[\nabla(\mathbf{A} \cdot \mathbf{B})]_i = \partial_i(A_jB_j) = A_j\partial_i B_j + B_j\partial_i A_j$, using subscript summation. I don't know how to proceed.

This is part of proving that

$$\frac{Dv}{Dt}=\frac{\partial v}{\partial t}+\nabla(\frac{v^2}{2})-v\times(\nabla\times v)$$
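As a side note on the connection (a sketch of the standard argument, not part of the original question): setting $\mathbf{A} = \mathbf{B} = v$ in the gradient identity gives

```latex
\nabla(v \cdot v) = 2(v\cdot\nabla)v + 2\,v\times(\nabla\times v)
\quad\Longrightarrow\quad
(v\cdot\nabla)v = \nabla\!\left(\frac{v^2}{2}\right) - v\times(\nabla\times v),
```

which, substituted into $\frac{Dv}{Dt} = \frac{\partial v}{\partial t} + (v\cdot\nabla)v$, yields the stated form.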


Three answers follow.

Accepted answer:

Let's start with the right-hand side. Considering the $i$th component, we have \begin{align*} &\!\!\! [(A \cdot \nabla) B + (B \cdot \nabla)A + A \times (\nabla \times B) + B \times (\nabla \times A)]_i\\ &= \delta^{jk}A_j\partial_kB_i + \delta^{jk}B_j\partial_kA_i + \epsilon^{jk}_{\;\;i}A_j\epsilon^{\mu\nu}_{\;\;k}\partial_\mu B_\nu + \epsilon^{jk}_{\;\;i}B_j\epsilon^{\mu\nu}_{\;\;k}\partial_\mu A_\nu\\ &= \delta^{jk}A_j\partial_kB_i + \delta^{jk}B_j\partial_kA_i + (\delta^\mu_i\delta^{\nu j} - \delta^\nu_i \delta^{\mu j})(A_j\partial_\mu B_\nu + B_j\partial_\mu A_\nu)\\ &= \delta^{jk}A_j\partial_kB_i + \delta^{jk}B_j\partial_kA_i + \delta^{jk}(A_j\partial_i B_k + B_j\partial_i A_k) - \delta^{jk}(A_j\partial_k B_i + B_j\partial_k A_i)\\ &= \delta^{jk}(A_j\partial_i B_k + B_j\partial_i A_k)\\ &= \partial_i(\delta^{jk} A_j B_k)\\ &= [\nabla(A \cdot B)]_i \end{align*}
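This index computation is easy to double-check symbolically. Below is a verification sketch in Python with SymPy; the component names `A1`…`B3` are arbitrary placeholder functions introduced here, not part of the answer.

```python
# Symbolic check of  ∇(A·B) = (A·∇)B + (B·∇)A + A×(∇×B) + B×(∇×A)
# using plain SymPy matrices with arbitrary smooth component functions.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)

A = sp.Matrix([sp.Function(f'A{i}')(x, y, z) for i in range(1, 4)])
B = sp.Matrix([sp.Function(f'B{i}')(x, y, z) for i in range(1, 4)])

def grad(f):
    # ∇f as a column vector
    return sp.Matrix([sp.diff(f, v) for v in X])

def curl(V):
    # ∇×V, written out componentwise
    return sp.Matrix([sp.diff(V[2], y) - sp.diff(V[1], z),
                      sp.diff(V[0], z) - sp.diff(V[2], x),
                      sp.diff(V[1], x) - sp.diff(V[0], y)])

def adv(V, W):
    # (V·∇)W: each component is sum_j V_j ∂_j W_i
    return sp.Matrix([sum(V[j] * sp.diff(W[i], X[j]) for j in range(3))
                      for i in range(3)])

lhs = grad(A.dot(B))
rhs = adv(A, B) + adv(B, A) + A.cross(curl(B)) + B.cross(curl(A))
print(sp.simplify(lhs - rhs))  # zero vector => identity holds
```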

Second answer:

I always find that for identities like these it's best to start from the RHS and work toward the LHS; since it's an equality, you can go either way. Starting with $\mathbf{A} \times (\nabla \times \mathbf{B})$, its $i$-th component is (where $\epsilon_{ijk}$ is the Levi-Civita symbol):

$$ [\mathbf{A} \times (\nabla \times \mathbf{B})]_i = \epsilon_{ijk}A_j[\nabla \times \mathbf{B}]_k = \epsilon_{ijk}\epsilon_{klm}A_j\partial_lB_m$$

where we use the notation

$$\partial_i := \frac{\partial}{\partial x_i}$$

and hence by the identity

$$ \epsilon_{kij}\epsilon_{klm} = \delta_{il}\delta_{jm}-\delta_{im}\delta_{jl}$$

we get that

$$ [\mathbf{A} \times (\nabla \times \mathbf{B})]_i = A_j\partial_iB_j - [(\mathbf{A} \cdot \nabla)\mathbf{B}]_i$$

where we realise that the first term on the RHS, $A_j\partial_iB_j$, is half of what we need on the left-hand side. Doing the same for $[\mathbf{B} \times (\nabla \times \mathbf{A})]_i$ and adding the two results gives the identity.
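The $\epsilon$–$\delta$ contraction identity used above can be verified by brute force over all index values. A quick sketch in Python (the helper `eps` is my own, not from the answer):

```python
# Brute-force check of  eps_{kij} eps_{klm} = delta_{il} delta_{jm} - delta_{im} delta_{jl}
# over all 3^4 = 81 combinations of free indices.
from itertools import product

def eps(i, j, k):
    # Levi-Civita symbol: sign of the permutation (i, j, k), else 0.
    return (j - i) * (k - i) * (k - j) // 2 if {i, j, k} == {0, 1, 2} else 0

def delta(a, b):
    # Kronecker delta
    return 1 if a == b else 0

for i, j, l, m in product(range(3), repeat=4):
    lhs = sum(eps(k, i, j) * eps(k, l, m) for k in range(3))
    rhs = delta(i, l) * delta(j, m) - delta(i, m) * delta(j, l)
    assert lhs == rhs

print("identity verified for all 81 index combinations")
```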

Third answer:

Following your approach of using subscript notation: $\newcommand{\b}{\boldsymbol}$ $$ \nabla (\b{A}\cdot \b{B}) = \nabla_{\b{A}}(\b{A}\cdot \b{B}) + \nabla_{\b{B}}(\b{A}\cdot \b{B}),\tag{1} $$

where $\nabla_{\b{A}}(\b{A}\cdot \b{B})$ means that only $\b{A}$ is differentiated while $\b{B}$ is held constant. We can prove: $$ \nabla_{\b{A}}(\b{A}\cdot \b{B}) = (\b{B}\cdot \nabla) \b{A} + \b{B}\times(\nabla \times\b{A}).\tag{2} $$ This is a rather geometrical identity: for $\b{B}$ fixed, the matrix acting on $\b{B}$ (the transpose of the Jacobian of $\b{A}$) decomposes into the Jacobian itself plus an anti-symmetric remainder: \begin{align} &\nabla_{\b{A}}(\b{A}\cdot \b{B}) = \begin{pmatrix}\partial_x A_1 &\partial_x A_2 & \partial_x A_3 \\ \partial_y A_1& \partial_y A_2 & \partial_y A_3 \\ \partial_z A_1 & \partial_z A_2 & \partial_z A_3 \end{pmatrix}\begin{pmatrix}B_1\\B_2 \\B_3\end{pmatrix} \\ =& \begin{pmatrix}\partial_x A_1 &\partial_y A_1 & \partial_z A_1 \\ \partial_x A_2& \partial_y A_2 & \partial_z A_2 \\ \partial_x A_3 & \partial_y A_3 & \partial_z A_3 \end{pmatrix}\begin{pmatrix}B_1\\B_2 \\B_3\end{pmatrix} \\ &+ \begin{pmatrix}0 & \partial_x A_2-\partial_y A_1 & \partial_x A_3 -\partial_z A_1 \\ \partial_y A_1 - \partial_x A_2& 0 & \partial_y A_3 - \partial_z A_2 \\ \partial_z A_1 - \partial_x A_3 & \partial_z A_2-\partial_y A_3 & 0 \end{pmatrix}\begin{pmatrix}B_1\\B_2 \\B_3\end{pmatrix} \\ =&(\b{B}\cdot \nabla) \b{A} + \b{B}\times(\nabla \times\b{A}), \end{align} where the last step extracts the anti-symmetric part, since the cross product (wedge) can be written in the following anti-symmetric matrix form: $$ \begin{pmatrix} 0 & -\partial_z & \partial_y \\ \partial_z & 0 & -\partial_x \\ -\partial_y & \partial_x & 0 \end{pmatrix} \begin{pmatrix} A_1\\A_2\\A_3 \end{pmatrix} = \nabla \times \b{A}. $$
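The matrix decomposition above can likewise be checked with SymPy. This is a sketch under my own naming: `J` is the Jacobian of A, and `A1`…`B3` are arbitrary placeholder functions.

```python
# Check of the decomposition  J^T B = J B + (J^T - J) B, where
# J B = (B·∇)A  and  (J^T - J) B = B×(∇×A).
import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)

A = sp.Matrix([sp.Function(f'A{i}')(x, y, z) for i in range(1, 4)])
B = sp.Matrix([sp.Function(f'B{i}')(x, y, z) for i in range(1, 4)])

J = A.jacobian(sp.Matrix(X))  # J[i, j] = ∂A_i/∂x_j

# ∇×A, written out componentwise.
curlA = sp.Matrix([sp.diff(A[2], y) - sp.diff(A[1], z),
                   sp.diff(A[0], z) - sp.diff(A[2], x),
                   sp.diff(A[1], x) - sp.diff(A[0], y)])

# (B·∇)A, computed componentwise.
advA = sp.Matrix([sum(B[j] * sp.diff(A[i], X[j]) for j in range(3))
                  for i in range(3)])

# The anti-symmetric part applied to B gives B×(∇×A) ...
print(sp.simplify((J.T - J) * B - B.cross(curlA)))  # zero vector
# ... and the Jacobian applied to B gives (B·∇)A.
print(sp.simplify(J * B - advA))                    # zero vector
```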

The rest is plugging (2) into (1) and doing the same for the other term.