Computing divergence of a piecewise constant vector field


Let $n \in \mathbb R^2$ be a given vector of length 1 and let $U$ be the following vector field: \begin{equation} U(x) : = \begin{cases} U^+ & \text{ if } x \cdot n>0 \\ U^- & \text{ if } x \cdot n\le 0 \end{cases} \end{equation} where $U^\pm$ are two constant (different) vectors in $\mathbb R^2$.

How can I compute the divergence of $U$ (in the sense of distributions)? In particular, what does the condition $\text{div } U=0$ imply for $U^\pm, n$? I am lost in my computations, I have tried to work out several examples, but I fail to see some light about this easy exercise.


Accepted answer:

The distributional divergence $\operatorname{div} U$ satisfies, for every test function $\phi \in \mathcal D(\mathbb R^2)$, $$ (\operatorname{div} U, \phi )=-(U,\nabla \phi). $$ By direct computation, $$(U,\nabla \phi) = \int_{x\cdot n> 0} U^+ \cdot \nabla \phi \,dx + \int_{x\cdot n\le 0} U^- \cdot \nabla \phi \,dx. $$

For any constant vector $U^\pm$, note that $U^\pm \cdot \nabla \phi =\nabla\cdot (U^\pm \phi)$, so the divergence theorem applies on each half-plane. Along the interface $\{x\cdot n=0\}$, the outward unit normal of the region $\{x\cdot n>0\}$ is $-n$, and that of $\{x\cdot n<0\}$ is $+n$. Hence $$(U,\nabla \phi) = -\int_{x\cdot n=0} \phi\, U^+\cdot n \,d\sigma + \int_{x\cdot n=0} \phi\, U^-\cdot n \,d\sigma = -\left( \int_{x\cdot n=0} \phi(x)\,d\sigma(x)\right) (U^+-U^-)\cdot n,$$ and therefore $$ (\operatorname{div} U, \phi ) = \left( \int_{x\cdot n=0} \phi(x)\,d\sigma(x)\right) (U^+-U^-)\cdot n.$$ In other words, $\operatorname{div} U = \big((U^+-U^-)\cdot n\big)\,\delta_{\{x\cdot n=0\}}$, a multiple of the surface (here: line) measure on the interface. In particular, $\operatorname{div} U=0$ if and only if $(U^+-U^-)\cdot n=0$: the normal components of $U^\pm$ must agree across the line, while the tangential component is free to jump.
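As a sanity check, the jump formula can be verified numerically: pick $n=(1,0)$ and concrete values for $U^\pm$ (all choices below are illustrative, not from the original post), compare $-(U,\nabla\phi)$ computed as a 2D Riemann sum against $\big((U^+-U^-)\cdot n\big)\int_{x=0}\phi\,d\sigma$ for a Gaussian test function. This is a sketch under those assumptions, not part of the proof.

```python
import numpy as np

# Numerical check of  div U = ((U^+ - U^-) . n) * (line measure on {x . n = 0}),
# specialized to n = (1, 0).  U^+ and U^- are arbitrary example vectors.
Up = np.array([2.0, -1.0])   # U^+, used where x . n > 0
Um = np.array([0.5,  3.0])   # U^-, used where x . n <= 0
n  = np.array([1.0,  0.0])   # unit normal

# Test function phi(x, y) = exp(-(x^2 + y^2)): smooth and effectively
# supported inside the box [-L, L]^2, so boundary terms are negligible.
L, N = 6.0, 1201
xs = np.linspace(-L, L, N)
ys = np.linspace(-L, L, N)
dx = xs[1] - xs[0]
X, Y = np.meshgrid(xs, ys, indexing="ij")
phi = np.exp(-(X**2 + Y**2))
dphidx = -2.0 * X * phi
dphidy = -2.0 * Y * phi

# Piecewise constant field U from the question (components on the grid)
Ux = np.where(X > 0, Up[0], Um[0])
Uy = np.where(X > 0, Up[1], Um[1])

# Left-hand side: (div U, phi) = -(U, grad phi), as a 2D Riemann sum
lhs = -np.sum(Ux * dphidx + Uy * dphidy) * dx * dx

# Right-hand side: ((U^+ - U^-) . n) times the integral of phi over {x = 0}
rhs = float((Up - Um) @ n) * np.sum(np.exp(-ys**2)) * dx

print(lhs, rhs)
```

Both quantities come out close to $(U^+-U^-)\cdot n \cdot \sqrt\pi$ (here $1.5\sqrt\pi$), and replacing $U^\pm$ by vectors with equal first components makes both vanish, matching the condition $(U^+-U^-)\cdot n=0$.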