While looking for some nice integrals that are not taught in school, I found this theorem:
Suppose $f$ is a bivariate harmonic function; $(a,b)$ is a point in the plane; and $r$ is a positive real number. Then, $$ \int^{2\pi}_{0} f(a+ r \cos \theta, b+r\sin \theta)d\theta=2\pi f(a,b) $$
Is there a nice way to prove the theorem above?
Unfortunately, I have no idea how to start proving this, but I suspect it's related to Euler's formula, since one of the examples on that page is solved that way.
Source $\longrightarrow$ Integration Tricks | Brilliant Math & Science Wiki
Start by writing $f$ as the real part of a function of a complex variable: since $f$ is harmonic on a neighbourhood of the closed disc, it has a harmonic conjugate there (the neighbourhood can be taken simply connected), so $f=\operatorname{Re} g$ for some holomorphic $g$. With $z_0:=a+ib$, the claimed result follows by taking real parts of
$$\int_0^{2\pi} d\theta \,g\bigl(z_0+re^{i\theta}\bigr)=2\pi\,g(z_0).$$
To prove this, expand the integrand as a Taylor series about $z_0$ and integrate term by term (legitimate, since the series converges uniformly on the circle), viz.
$$\sum_{n\ge 0}\frac{g^{(n)}(z_0)}{n!}r^n\int_0^{2\pi} d\theta \,e^{in\theta}.$$
The required result follows from $\int_0^{2\pi} d\theta\, e^{in\theta}=2\pi\delta_{n0}$, which kills every term except $n=0$.
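As a quick numerical sanity check of the mean value property, here is a short sketch using the (arbitrarily chosen) harmonic function $f(x,y)=e^x\cos y$, the real part of $e^z$; the point $(a,b)$ and radius $r$ below are likewise just sample values:

```python
import math

def f(x, y):
    # Harmonic test function: Re(e^z) = e^x * cos(y)
    return math.exp(x) * math.cos(y)

def circle_integral(f, a, b, r, n=2000):
    # Uniform Riemann sum over [0, 2*pi); for a smooth periodic
    # integrand this converges extremely fast.
    h = 2 * math.pi / n
    return h * sum(f(a + r * math.cos(k * h), b + r * math.sin(k * h))
                   for k in range(n))

a, b, r = 0.5, -0.3, 2.0
lhs = circle_integral(f, a, b, r)   # integral of f around the circle
rhs = 2 * math.pi * f(a, b)         # 2*pi times the centre value
print(lhs, rhs)
```

The two printed values agree to high precision, as the theorem predicts.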