Integral of harmonic function in a ball


Let $f\in C^2(\Omega)$ be a harmonic function in $\Omega\subset\mathbb{R}^2$, and define:

$$ \phi(r) = \frac{1}{2\alpha_2r} \int_{\partial B_r(x)} f(y) d \sigma(y) $$

Prove that $\phi '(r)=0$ by calculating the line integral.

Here, $\partial B_r(x)=\{y=(y_1,y_2)\in \mathbb{R}^2:(y_1-x_1)^2+(y_2-x_2)^2=r^2\}$, so to calculate the integral I can parameterize the curve as usual: $y_1(\theta)=x_1+r\cos \theta$, $y_2(\theta)=x_2+r\sin \theta$. But I don't know how to calculate the integral because I don't have an explicit form of $f$. What am I missing? How can I use the fact that $\Delta f=0$? Thank you.
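As a quick numerical sanity check (not the proof itself), one can verify that $\phi(r)$ is indeed constant for a sample harmonic function. The sketch below assumes $f(x,y)=x^2-y^2$ (which satisfies $\Delta f = 0$) and an arbitrarily chosen center $(0.3,\,0.5)$; it approximates the circle average with the parameterization above and a simple Riemann sum:

```python
import math

def circle_average(f, cx, cy, r, n=2000):
    # Approximate phi(r) = (1 / (2*pi*r)) * integral of f over the circle
    # of radius r centered at (cx, cy), using the parameterization
    # y1 = cx + r*cos(theta), y2 = cy + r*sin(theta).
    # The arc-length element r*dtheta cancels the 1/(2*pi*r) prefactor,
    # leaving a plain average of f over theta in [0, 2*pi).
    total = 0.0
    for k in range(n):
        theta = 2 * math.pi * k / n
        total += f(cx + r * math.cos(theta), cy + r * math.sin(theta))
    return total / n

def f(x, y):
    # Sample harmonic function: f_xx + f_yy = 2 - 2 = 0.
    return x * x - y * y

# The average should not depend on r (mean value property),
# and should equal f at the center: f(0.3, 0.5) = -0.16.
averages = [circle_average(f, 0.3, 0.5, r) for r in (0.1, 0.5, 1.0, 2.0)]
print(averages)
```

Running this, all four averages agree (up to quadrature error) with $f$ at the center, which is consistent with $\phi'(r)=0$.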