I have to prove this theorem, but a monstrous computation appears when I try to compute the Laplacian of $\hat{u}$. Does anyone know an easier way to prove it?
Let $\Omega$ be an open subset of $\mathbb{R}^{n}$ and $u:\Omega \to \mathbb{R}$; define \begin{equation} \Omega_{1}=\left\{\frac{x}{|x|^{2}} \quad \Big| \quad x \in \Omega, \enspace x \neq 0\right\}, \quad \quad \hat{u}(x)= |x|^{2-n}\, u\!\left(\frac{x}{|x|^{2}}\right) \quad \forall x \in \Omega_{1}. \end{equation} Verify that $u$ is harmonic in $\Omega$ if and only if $\hat{u}$ is harmonic in $\Omega_{1}$.
By-hand computation is the only way.
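That said, the by-hand computation can be organized around a single standard identity, which is worth proving once and for all:

$$ \Delta \hat{u}(x) \;=\; |x|^{-n-2}\,(\Delta u)\!\left(\frac{x}{|x|^{2}}\right), \qquad x \in \Omega_{1}. $$

Since $|x|^{-n-2} > 0$ on $\Omega_{1}$, the left side vanishes exactly when $\Delta u$ vanishes at the reflected point $x/|x|^2$. The inversion $x \mapsto x/|x|^{2}$ is an involution interchanging $\Omega \setminus \{0\}$ and $\Omega_{1}$, so this one identity gives both directions of the equivalence at once.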
The transformation $u\mapsto \hat{u}$ is called the Kelvin transform. It is a very special transformation available on $\mathbb{R}^n$. In fact, when $n \geq 3$, for transformations of the type
$$ u(x) \mapsto u_{w,\phi}(x) = w(x) u(\phi(x)) $$
where $w: \mathbb{R}^n \to \mathbb{R}$ and $\phi: \mathbb{R}^n \to \mathbb{R}^n$ (possibly with a point removed from the domain), it is a theorem that the only non-trivial transformation of this type that preserves harmonicity is the Kelvin transform. So you cannot expect this property of the Kelvin transform to fall out of some general theorem covering a larger class of transformations.
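If you want to sanity-check the harmonicity-preserving property without grinding through the general computation by hand, a computer algebra system can do the verification symbolically on sample functions. Here is a minimal SymPy sketch for $n = 3$ (the function names `laplacian` and `kelvin` are just local helpers, not library API):

```python
# Symbolic sanity check of the Kelvin transform in R^3 using SymPy.
# This verifies the claim on a sample harmonic function; it is not a
# substitute for the general proof.
import sympy as sp

x, y, z = sp.symbols('x y z', real=True)
r2 = x**2 + y**2 + z**2  # |x|^2

def laplacian(f):
    """Laplacian in Cartesian coordinates on R^3."""
    return sp.diff(f, x, 2) + sp.diff(f, y, 2) + sp.diff(f, z, 2)

def kelvin(u):
    """Kelvin transform |x|^{2-n} u(x/|x|^2) with n = 3."""
    inversion = {x: x / r2, y: y / r2, z: z / r2}
    return r2**sp.Rational(2 - 3, 2) * u.subs(inversion, simultaneous=True)

u = x * y            # harmonic in R^3: u_xx + u_yy + u_zz = 0
u_hat = kelvin(u)    # = x*y / |x|^5
print(sp.simplify(laplacian(u_hat)))  # prints 0: u_hat is harmonic too
```

Running this with other harmonic polynomials (or with a non-harmonic $u$, for which the transformed Laplacian is nonzero) illustrates both directions of the equivalence concretely.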