Given that $u(x,y) \in C^2(\mathbb{R}^2)$, prove that $\int_{C(r)} \frac{\partial u}{\partial \vec n}\,\mathrm{d}s = 0$ for every circle $C(r) = \{(x,y)\mid (x-x_0)^2+(y-y_0)^2=r^2\}$ is equivalent to $\Delta u=0$ on $\mathbb{R}^2$.
I don't really understand whether I should prove that $\Delta u=0$ holds on the circle itself or inside the disk it bounds.
Big Hint
Let $B(r)=\{(x,y)\mid (x-x_0)^2+(y-y_0)^2<r^2\}$. By the divergence theorem,\begin{align*} 0&=\int_{C(r)}\frac{\partial u}{\partial \vec n}\,\mathrm{d}s=\int_{C(r)}\nabla u\cdot \vec n \,\mathrm{d}s \\ &=\int_{B(r)}\operatorname{div}(\nabla u)\,\mathrm{d}A\\ &=\int_{B(r)}\Delta u\,\mathrm{d}A, \end{align*} for all $r>0$ and all centers $(x_0,y_0)$. From this, I leave it to you to conclude that $\Delta u=0$.
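For the remaining step, one standard route (a shrinking-disk argument, using only the continuity of $\Delta u$, which $u\in C^2$ guarantees) is to divide by the area of $B(r)$ and let $r\to 0^+$:
$$0=\frac{1}{\pi r^2}\int_{B(r)}\Delta u\,\mathrm{d}A \;\longrightarrow\; \Delta u(x_0,y_0) \quad\text{as } r\to 0^+,$$
since the average of a continuous function over shrinking disks converges to its value at the center. Because $(x_0,y_0)$ is arbitrary, this gives $\Delta u\equiv 0$ on $\mathbb{R}^2$. The converse direction ($\Delta u=0$ implies the flux integrals vanish) is immediate from the same chain of equalities read backwards.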