What is $\frac{d}{d\mathbf v} \left[ \nabla \cdot (\mathbf v\otimes \mathbf v ) \right]$?


Let $\mathbf v(\mathbf x): \mathbb R^3 \to \mathbb R^3$ with $\mathbf x \in \mathbb R^3$. I am trying to determine the following gradient:
$$ \frac{d}{d\mathbf v} \left[ \nabla \cdot (\mathbf v\otimes \mathbf v ) \right] $$


This is my approach using matrix calculus:

Using the well-known identity $\nabla \cdot (\mathbf {a} \otimes \mathbf {b} )=(\nabla \cdot \mathbf {a} )\mathbf {b} +\mathbf {a} \cdot \nabla \mathbf {b}$,

I have that,

$$ \nabla \cdot (\mathbf v\otimes \mathbf v ) = (\nabla \cdot \mathbf v)\mathbf v + \mathbf v \cdot \nabla \mathbf v. $$
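This expansion is easy to sanity-check symbolically. Below is a quick SymPy sketch that verifies $\nabla\cdot(\mathbf v\otimes\mathbf v) = (\nabla\cdot\mathbf v)\,\mathbf v + (\mathbf v\cdot\nabla)\mathbf v$ component-wise; the field $\mathbf v = (xy,\, yz,\, zx)$ is an arbitrary example choice, not anything from the question.

```python
# Symbolic check of  div(v ⊗ v) = (div v) v + (v·∇) v
# for a concrete example field v = (x*y, y*z, z*x).
import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)
v = sp.Matrix([x*y, y*z, z*x])

# (v ⊗ v)_{ji} = v_j v_i; i-th component of div(v ⊗ v) is ∂_j (v_j v_i)
lhs = sp.Matrix([sum(sp.diff(v[j]*v[i], X[j]) for j in range(3))
                 for i in range(3)])

div_v = sum(sp.diff(v[j], X[j]) for j in range(3))
grad_v = sp.Matrix(3, 3, lambda j, i: sp.diff(v[i], X[j]))  # (∇v)_{ji} = ∂_j v_i
rhs = div_v * v + grad_v.T * v   # (div v) v + (v·∇) v

assert sp.simplify(lhs - rhs) == sp.zeros(3, 1)
```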

Then, $$ \frac{d}{d\mathbf v} \left( \mathbf v \cdot \nabla \mathbf v \right) = \mathbf v^T\frac{d}{d\mathbf v}(\nabla \mathbf v) + (\nabla \mathbf v)^T \frac{d\mathbf v}{d\mathbf v} $$ and using the fact that $\frac{d}{d\mathbf v}(\nabla \mathbf v) = 0$, $$ \frac{d}{d\mathbf v} \left( \mathbf v \cdot \nabla \mathbf v \right) = \nabla \mathbf v $$

Finally, $$ \frac{d}{d\mathbf v} \left((\nabla \cdot \mathbf v)\mathbf v\right) = (\nabla \cdot \mathbf v) \frac{d\mathbf v}{d\mathbf v} + \mathbf v \frac{d}{d\mathbf v} (\nabla \cdot \mathbf v) $$ now here I am not sure if $\frac{d}{d\mathbf v} (\nabla \cdot \mathbf v) =0$?! and from here on I'm stuck.

BEST ANSWER

It’s best to revert to index notation; I’ll use the Einstein summation convention. You want to calculate: $$ \frac{\partial}{\partial v_i}\partial_j (v_kv_j) $$

The issue is that the derivative with respect to $v$ is ambiguous. The natural default is to treat the components of $v$ and its partial derivatives as independent variables; for a geometric picture, look up the notion of jets. Formally, this gives: $$ \frac{\partial }{\partial v_i}v_j=\delta_{ij}\\ \frac{\partial }{\partial v_i}\partial_jv_k=0 $$

Your preliminary result is: $$ \partial_j (v_kv_j) =(\partial_j v_k)v_j+ v_k (\partial_jv_j) $$ Applying the derivative gives: $$ \frac{\partial}{\partial v_i}\partial_j (v_kv_j)=\partial_iv_k+\delta_{ik}\partial_jv_j $$
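This index computation can be checked mechanically. In the sketch below, the components $v_k$ and the derivatives $g_{jk} = \partial_j v_k$ are modeled as independent SymPy symbols, exactly as the jet picture prescribes; the symbol names are mine, not from the answer.

```python
# Treat v_k and g_{jk} = ∂_j v_k as independent symbols and verify
#   ∂/∂v_i [ ∂_j (v_k v_j) ] = ∂_i v_k + δ_{ik} ∂_j v_j.
import sympy as sp

v = sp.symbols('v0:3')                                     # components v_k
g = sp.Matrix(3, 3, lambda j, k: sp.Symbol(f'g{j}{k}'))    # g[j,k] = ∂_j v_k
trace = sum(g[j, j] for j in range(3))                     # ∂_j v_j

# f_k = ∂_j(v_k v_j) = (∂_j v_k) v_j + v_k (∂_j v_j)
f = [sum(g[j, k]*v[j] for j in range(3)) + v[k]*trace for k in range(3)]

delta = sp.eye(3)
for i in range(3):
    for k in range(3):
        expected = g[i, k] + delta[i, k]*trace
        assert sp.simplify(sp.diff(f[k], v[i]) - expected) == 0
```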

Another approach is, on the contrary, to insist that $v$ and its derivatives are not independent. Intuitively, in a discrete setting you would replace the derivatives by finite differences, which creates an implicit dependence to which you have to apply the chain rule. Formally, this works just as in the derivation of the Euler-Lagrange equations. You instead get: $$ \frac{d}{dv_i}F(v_j,\partial_kv_l)= \frac{\partial F}{\partial v_i}-\partial_j\frac{\partial F}{\partial (\partial_j v_i)} $$ In this case: $$ \frac{d}{d v_i}\partial_j (v_kv_j)=0 $$ since it is a total derivative.
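The vanishing of this variational derivative can also be checked on a concrete field. The sketch below evaluates $\partial F_k/\partial v_i - \partial_j\,\partial F_k/\partial(\partial_j v_i)$ for $F_k = \partial_j(v_k v_j)$; the example field $\mathbf v = (xy,\, yz,\, zx)$ is my arbitrary choice.

```python
# Euler-Lagrange-style check: for F_k = ∂_j(v_k v_j),
#   ∂F_k/∂v_i  =  ∂_i v_k + δ_{ik} ∂_j v_j
#   ∂F_k/∂(∂_j v_i)  =  δ_{ik} v_j + δ_{ij} v_k
# and their combination ∂F/∂v_i − ∂_j ∂F/∂(∂_j v_i) vanishes identically.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)
v = [x*y, y*z, z*x]                          # arbitrary example field

div_v = sum(sp.diff(v[j], X[j]) for j in range(3))
for i in range(3):
    for k in range(3):
        dF_dv = sp.diff(v[k], X[i]) + (div_v if i == k else 0)
        dF_dg = sum(sp.diff((v[j] if i == k else 0) + (v[k] if i == j else 0),
                            X[j])
                    for j in range(3))
        assert sp.simplify(dF_dv - dF_dg) == 0
```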

Hope this helps.


$ \def\n{\nabla} \def\p{\partial} \def\a{{\mathbf f}} \def\v{{\mathbf v}} \def\g#1#2{\frac{\p #1}{\p #2}} \def\LR#1{\left(#1\right)} $Your first result is perfectly fine, but let's name it $\a$, retain explicit tensor products, and rearrange things so that all of the free $\v$ variables are on the RHS $$\eqalign{ \a &= \n\cdot\LR{\v\otimes\v} \\ &= \LR{\n\cdot\v}\otimes\v + \v\cdot\LR{\n\otimes\v} \qquad\qquad\\ &= \LR{\n\cdot\v}\otimes\v + \LR{\n\otimes\v}^T\cdot\v \\ }$$

The only other thing you need to know is that $\LR{\large\g\v\v}$ equals the identity matrix $$\eqalign{ \g\a\v &= \LR{\n\cdot\v}\otimes\LR{\g\v\v} + \LR{\n\otimes\v}^T\cdot\LR{\g\v\v} \\ &= \LR{\n\cdot\v}\otimes I \;+\; \LR{\n\otimes\v}^T \\ }$$

This result is typically written without explicit tensor products $$\eqalign{ \g\a\v &= \LR{\n\cdot\v}I + \LR{\n\v}^T \qquad\qquad\qquad\qquad \\ }$$

Or you could use index notation, as others have suggested $$\eqalign{ \def\d{\delta} f_i &= \p_j\LR{v_j v_i} \\ &= \LR{\p_jv_j}v_i + v_j\LR{\p_j v_i} \\ \g{f_i}{v_k} &= \LR{\p_jv_j}\d_{ik} + \d_{jk}\LR{\p_j v_i} \\ &= \LR{\p_jv_j}\d_{ik} + \LR{\p_k v_i} \\ }$$
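The matrix form $\frac{\partial\mathbf f}{\partial\mathbf v} = (\nabla\cdot\mathbf v)\,I + (\nabla\mathbf v)^T$ can be verified symbolically in the same jet-style setting, treating $\mathbf v$ and $\nabla\mathbf v$ as independent. In this SymPy sketch, `G[j, i]` plays the role of $\partial_j v_i$; the names are mine.

```python
# Verify  ∂f/∂v = (div v) I + (∇v)^T  with v and ∇v as independent symbols.
import sympy as sp

v = sp.Matrix(sp.symbols('v0:3'))
G = sp.Matrix(3, 3, lambda j, i: sp.Symbol(f'G{j}{i}'))   # (∇v)_{ji} = ∂_j v_i
div_v = sum(G[j, j] for j in range(3))

f = div_v * v + G.T * v        # f = (div v) v + (∇v)^T v
J = f.jacobian(v)              # J_{ik} = ∂ f_i / ∂ v_k

assert sp.expand(J - (div_v*sp.eye(3) + G.T)) == sp.zeros(3, 3)
```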