I have to solve this problem:
$v=\partial_\phi$ on $M=\mathbb{R}^2\setminus\{0\}$, where the components of $v$ are given in polar coordinates.
Calculate the divergence of $v$.
Even with the help of comments and answers, I am having trouble solving this. Could someone give me a full solution? (I am not trying to avoid working on it myself!)
We use the definition of divergence
$$\text{div } u = \frac{1}{\sqrt{g}} \frac{\partial}{\partial x^i} (u^i \sqrt{g})$$
where $g$ here means the determinant of the metric and summation over $i$ is implied.
Here, our vector field is $u = \partial_\phi$. To make this concrete, write it out in components with respect to the coordinate basis vectors $\partial_r$ and $\partial_\phi$:
$$u = u^r \partial_r + u^\phi \partial_\phi$$
That is, $u^r = 0$ and $u^\phi = 1$; these are the numbers that stand in for $u^i$ in the divergence formula.
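For completeness, the value $g = r^2$ used in the next step comes from the standard flat metric on the plane written in polar coordinates (this is a standard fact, not part of the original problem statement):
$$ds^2 = dr^2 + r^2\, d\phi^2 \quad\Longrightarrow\quad (g_{ij}) = \begin{pmatrix} 1 & 0 \\ 0 & r^2 \end{pmatrix}, \qquad g = \det(g_{ij}) = r^2, \qquad \sqrt{g} = r.$$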
Finally, with $g = r^2$ and hence $\sqrt{g} = r$, we get
$$\text{div } u = \frac{1}{r} \left[ \frac{\partial}{\partial r} (0 \cdot r) + \frac{\partial}{\partial \phi} (1 \cdot r) \right] = 0,$$
so the divergence vanishes everywhere on $M$.
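As a quick sanity check, the computation can be reproduced symbolically. The following is a small sketch using SymPy; the variable names are my own choices, not from the problem:

```python
import sympy as sp

r, phi = sp.symbols('r phi', positive=True)

# Flat metric in polar coordinates: ds^2 = dr^2 + r^2 dphi^2,
# so the metric determinant is g = r^2.
sqrt_g = sp.sqrt(r**2)  # equals r, since r > 0

# Components of u = d/dphi in the coordinate basis (d/dr, d/dphi).
u_r, u_phi = sp.Integer(0), sp.Integer(1)

# div u = (1/sqrt(g)) * d_i (u^i sqrt(g)), summed over i in {r, phi}.
div_u = (sp.diff(u_r * sqrt_g, r) + sp.diff(u_phi * sqrt_g, phi)) / sqrt_g

print(sp.simplify(div_u))  # 0
```

The $\phi$-derivative kills the $r$-dependent term and the $r$-component is identically zero, which is exactly why the hand computation above gives $0$.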