Divergence on domain's boundary


As an example, the divergence of a 2D vector field $\mathbf{u}=(u,v)$ is defined as $u_x + v_y$. Consider the constant field $\mathbf{u}=(1,0)$, whose divergence is zero. Now modify this field by setting it to zero on the half-plane $x<0$, so that $\mathbf{u}=(0,0)$ there; we then have a piecewise vector field. It is easy to show that the divergence of this field is $\delta(x)$: either apply $u_x+v_y$ directly (in the distributional sense, $u=H(x)$ gives $u_x=\delta(x)$), or use the original definition of divergence as the limit of a flux integral divided by the enclosed area.
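The $\delta(x)$ claim can be checked numerically by pairing the distributional divergence with a test function: $\langle \nabla\cdot\mathbf{u}, \varphi\rangle = -\iint \mathbf{u}\cdot\nabla\varphi \, dA$, which for $\operatorname{div}\mathbf{u}=\delta(x)$ should equal $\int \varphi(0,y)\,dy$. A minimal sketch (my own choice of Gaussian test function, not from the question):

```python
import numpy as np

# Test function phi(x,y) = exp(-x^2 - y^2) and its x-derivative.
def phi(x, y):
    return np.exp(-x**2 - y**2)

def dphi_dx(x, y):
    return -2 * x * np.exp(-x**2 - y**2)

# Grid over a box large enough that phi is negligible on its edges.
n = 2001
xs = np.linspace(-6.0, 6.0, n)
ys = np.linspace(-6.0, 6.0, n)
X, Y = np.meshgrid(xs, ys, indexing="ij")

# u = (1,0) on the half-plane x >= 0, and (0,0) on x < 0.
mask = X >= 0

# Distributional pairing: <div u, phi> = -∬ u . grad(phi) dA,
# which here reduces to -∬_{x>0} d(phi)/dx dA.
dx = xs[1] - xs[0]
lhs = -np.sum(dphi_dx(X, Y)[mask]) * dx * dx

# If div u = delta(x), the pairing should be the line integral ∫ phi(0,y) dy.
dy = ys[1] - ys[0]
rhs = np.sum(phi(0.0, ys)) * dy

print(lhs, rhs)  # both approximately sqrt(pi) ~ 1.772
```

Both sums agree (analytically each equals $\sqrt{\pi}$ for this Gaussian), consistent with the divergence being a line delta of unit strength along $x=0$.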

My question is how to generalize this. For example, instead of zeroing the field on the half-plane $x<0$, what if we zero it outside the unit circle? Then we have $$ \mathbf{u} = \begin{cases} (1,0) & \sqrt{x^2+y^2}\le 1,\\ (0,0) & \text{otherwise.} \end{cases} $$ What is the divergence on the unit circle? I would guess the local normal direction should matter.
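One can probe this with the flux-limit definition mentioned above: take a small square centered at a boundary point $(\cos\theta, \sin\theta)$, compute the outward flux of $\mathbf{u}$ through it, and divide by the length of circle arc inside the square. A rough numerical sketch (the square size and discretization are my own choices):

```python
import numpy as np

def boundary_flux_density(theta, eps, m=200001):
    """Outward flux of u = chi_{unit disk} * (1, 0) through a small
    axis-aligned square of side eps centered at (cos theta, sin theta),
    divided by the length of unit-circle arc inside the square."""
    cx, cy = np.cos(theta), np.sin(theta)
    ys = np.linspace(cy - eps / 2, cy + eps / 2, m)
    dy = ys[1] - ys[0]
    inside = lambda x, y: x * x + y * y <= 1.0
    # u = (1,0), so only the two vertical edges carry flux:
    # right edge has outward normal (1,0), left edge (-1,0).
    flux = (np.sum(inside(cx + eps / 2, ys))
            - np.sum(inside(cx - eps / 2, ys))) * dy
    # Arc length of the unit circle inside the square (ds = dt on unit circle).
    ts = np.linspace(0.0, 2 * np.pi, m)
    dt = ts[1] - ts[0]
    in_box = ((np.abs(np.cos(ts) - cx) <= eps / 2)
              & (np.abs(np.sin(ts) - cy) <= eps / 2))
    arclen = np.sum(in_box) * dt
    return flux / arclen

thetas = [0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4, np.pi]
results = {t: boundary_flux_density(t, 1e-3) for t in thetas}
for t, d in results.items():
    print(f"theta={t:.3f}  flux density={d:+.3f}  -cos(theta)={-np.cos(t):+.3f}")
```

The computed flux density tracks $-\cos\theta$, i.e. $-\mathbf{u}\cdot\mathbf{n}$ with $\mathbf{n}$ the outward normal of the disk, which supports the guess that the local normal direction is what matters.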

Lastly, what if there is "nothing" outside of the unit circle, i.e. the field's domain is just the closed unit disk? Can the divergence still be defined on the boundary in that case?