I am reading a book about the Euler equations, and at some point the author proves an extension criterion; I am struggling with one step of the proof. We reach the following integral $$A \equiv\int_{|y| \ge \delta} \frac{|u (x + y)|}{|y|^3} dy$$ and the author claims that, for $u \in C^0([0,T); H^s(\mathbb R^3))$ ($s > d/2 + 1 = 5/2$, where $d = 3$ is the dimension) a strong solution of the 3D Euler equations, $$A \le C(|\log \delta| \|u\|_{L^\infty} + \|u\|_{L^2}).$$ Note that $H^s$ embeds in $L^\infty$, so the norms above are well-defined.
I really don't see how he gets this inequality. The $\log \delta$ seems to come from the use of polar coordinates (the surface measure of the unit sphere contributes a factor $4\pi$, absorbed into $C$): $$A \le C\|u\|_{L^\infty} \int_{\delta}^\infty \frac{1}{r}\,dr = C\|u\|_{L^\infty}(\log (\infty) - \log \delta).$$ But the integral diverges at infinity, so how am I supposed to get rid of this $\log \infty$? And where does the $L^2$-norm come from?
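My current guess (which I cannot fully justify) is that one should split the integral at $|y| = 1$, assuming $\delta < 1$:
$$A = \int_{\delta \le |y| \le 1} \frac{|u(x+y)|}{|y|^3}\,dy + \int_{|y| > 1} \frac{|u(x+y)|}{|y|^3}\,dy.$$
The first piece would give the logarithm via polar coordinates,
$$\int_{\delta \le |y| \le 1} \frac{|u(x+y)|}{|y|^3}\,dy \le 4\pi \|u\|_{L^\infty} \int_\delta^1 \frac{dr}{r} = 4\pi \|u\|_{L^\infty} |\log \delta|,$$
while on the second piece $|y|^{-3} \in L^2(\{|y| > 1\})$, so Cauchy–Schwarz would give
$$\int_{|y| > 1} \frac{|u(x+y)|}{|y|^3}\,dy \le \|u\|_{L^2} \left(\int_{|y| > 1} \frac{dy}{|y|^6}\right)^{1/2} = \left(\frac{4\pi}{3}\right)^{1/2} \|u\|_{L^2}.$$
Is this the intended argument?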