How to apply divergence theorem to transform 2D integral to 1D integral?


I am trying to understand Medina and Jones's paper, "Leading-edge vortex burst on a low-aspect-ratio rotating flat plate" (2016).

They derived an equation in the appendix of this paper, stating that the divergence theorem was applied in the plane to obtain the relationship below.

$$\int_A\nabla_{2D} \cdot(\vec u_{2D}\omega _z)dA=\int_S \vec n_{2D}\cdot \vec u_{2D}\omega_{z}dS$$ $$\text{where} \quad \nabla_{2D}=(\partial/\partial x , \partial/\partial y)\quad \text{and} \quad \vec u_{2D}=(u,v)$$ $$A\text{ is an area pointing in the $z$ direction, bounded by perimeter $S$}$$

As far as I know, the divergence theorem relates a volume (3D) integral to a surface (2D) integral, or vice versa. Is there a way to apply the divergence theorem to transform a 2D (area) integral into a 1D (line) integral?


On BEST ANSWER

Here is a general version of the "divergence theorem":

$$ \int_U \nabla\cdot F\ dV=\int_{\partial U}F\cdot\nu \ dS $$ where $U\subset\mathbb{R}^n$ is an open set with "nice" (e.g. piecewise-smooth) boundary, $F$ is a smooth vector field, and $\nu$ is the outward unit normal on $\partial U$. When $n=2$, $U$ is a plane region and $\partial U$ is its bounding curve, so the right-hand side is a one-dimensional line integral with respect to arc length; in this case the statement is equivalent to Green's theorem. Taking $F=\vec u_{2D}\,\omega_z$ yields exactly the identity in the question.
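As a sanity check, the $n=2$ case can be verified numerically. This is a minimal sketch using a hypothetical test field $F=(x^3, y^3)$ on the unit disk, where $\nabla\cdot F = 3x^2+3y^2$; the exact value of both sides is $3\pi/2$.

```python
import numpy as np

# Hypothetical test field F = (x^3, y^3) on the unit disk, so div F = 3(x^2 + y^2).
def div_F(x, y):
    return 3 * x**2 + 3 * y**2

# Area (2D) integral in polar coordinates, dA = r dr dtheta, midpoint rule.
nr, nt = 400, 400
r = (np.arange(nr) + 0.5) / nr                 # radial midpoints in (0, 1)
t = (np.arange(nt) + 0.5) * 2 * np.pi / nt     # angular midpoints in (0, 2*pi)
R, T = np.meshgrid(r, t)
X, Y = R * np.cos(T), R * np.sin(T)
area_integral = np.sum(div_F(X, Y) * R) * (1.0 / nr) * (2 * np.pi / nt)

# Boundary (1D) integral: on the unit circle the outward normal is
# nu = (cos t, sin t) and the arc-length element is ds = dtheta.
bx, by = np.cos(t), np.sin(t)
flux = np.sum(bx**3 * bx + by**3 * by) * (2 * np.pi / nt)

print(area_integral, flux)  # both approach 3*pi/2 as the grids refine
```

The two numbers agree to within the quadrature error, illustrating that the 2D "divergence theorem" trades an area integral for a line integral around the perimeter.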