Although the question arose in a physics book (A. Messiah, *Quantum Mechanics*, p. 939), it is a purely mathematical problem. I would like to evaluate the expression (with $r=\sqrt{x^2+y^2+z^2}$) \begin{equation} \vec{B}=\nabla \times \left(\nabla \times \frac{\vec{m}}{r}\right), \end{equation} which arises for the magnetic field of a magnetic moment $\vec{m}$ (independent of $\vec{r}$), at $r=0$.

Messiah first uses the standard vector identity $\nabla\times(\nabla\times\vec{A})=\nabla(\nabla\cdot\vec{A})-\Delta\vec{A}$ with $\vec{A}=\vec{m}/r$ to get \begin{equation} \nabla(\vec{m}\cdot\nabla)\frac{1}{r} - \vec{m}\Delta\frac{1}{r}, \end{equation} and then splits the terms as \begin{equation} -\tfrac{2}{3}\vec{m}\Delta\frac{1}{r} +\left[\nabla(\vec{m}\cdot\nabla) - \tfrac{1}{3}\vec{m}\Delta\right]\frac{1}{r}. \end{equation}

Whereas the first term is well known from the Laplace equation (since $\Delta\frac{1}{r}=-4\pi\delta^3(\vec{r})$), Messiah argues that the second term, $[\dots]\frac{1}{r}$, vanishes. He states that this can be seen by integrating $[\dots]\frac{1}{r}$ multiplied by a test function $f(\vec{r})$ over a small domain around $r=0$, expanding $f(\vec{r})$ in spherical harmonics, and letting $r\rightarrow 0$. However, I really cannot follow how this works. Could somebody perhaps give an alternative explanation for this? Thank you very much!
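For context, one ingredient of the argument can at least be checked directly: away from the origin, $\Delta\frac{1}{r}=0$, so the bracketed term reduces to the ordinary dipole field $\nabla(\vec{m}\cdot\nabla)\frac{1}{r} = \frac{3(\vec{m}\cdot\hat{r})\hat{r}-\vec{m}}{r^3}$, and its angular average over any sphere vanishes. (This seems to be why only the spherically symmetric part of the test function could contribute.) Here is a small SymPy sketch of that angular average; it is my own check, not taken from Messiah:

```python
import sympy as sp

theta, phi = sp.symbols('theta phi')
mx, my, mz = sp.symbols('m_x m_y m_z')

# unit vector on the sphere
n = sp.Matrix([sp.sin(theta)*sp.cos(phi),
               sp.sin(theta)*sp.sin(phi),
               sp.cos(theta)])
m = sp.Matrix([mx, my, mz])

# angular part of the traceless dipole term: 3(m.n)n - m
integrand = 3*(m.dot(n))*n - m

# integrate each component over the full solid angle, dOmega = sin(theta) dtheta dphi;
# this uses int n_i n_j dOmega = (4*pi/3) delta_ij, so 3*(4*pi/3)*m - 4*pi*m = 0
avg = integrand.applyfunc(
    lambda c: sp.integrate(sp.integrate(c*sp.sin(theta), (theta, 0, sp.pi)),
                           (phi, 0, 2*sp.pi)))
print(avg)  # each component integrates to exactly 0
```

Of course, this only shows that the angular average vanishes for $r\neq 0$; it does not by itself explain the distributional limit $r\rightarrow 0$ that Messiah invokes.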
P.S. The problem can be found here on p. 67 (please tell me if I am not allowed to link this).