The integral in question is
$$\int_0^\infty (f(x)-a)^2dx$$
where $f(x)$ is a continuous function and $a$ is a constant.
When we expand the integrand, we end up with an $a^2$ term. We can then split up the integral to get:
$$\int_0^\infty [f(x)]^2\,dx - \int_0^\infty 2af(x)\,dx + \int_0^\infty a^2\,dx$$
Now we know that the third of the above integrals diverges (assuming $a \neq 0$), since its antiderivative is $a^2x$, which tends to infinity as $x \to \infty$.
Is this fact enough to demonstrate that the original integral diverges? I strongly suspect not, but I don't know for sure.
In order to split an integral into a sum of integrals, each piece must converge to a finite value, or at worst only one of $+\infty$ and $-\infty$ may occur among the divergent pieces. That need not happen here: take $f(x) = a$ with $a \neq 0$. The original integral is zero, but the split produces the indeterminate form $\infty - \infty + \infty$.
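To make the counterexample concrete, here is the worked check for $f(x) = a$ with $a \neq 0$:

$$\int_0^\infty (a-a)^2\,dx = \int_0^\infty 0\,dx = 0,$$

while the three pieces of the split are

$$\int_0^\infty a^2\,dx = +\infty, \qquad -\int_0^\infty 2a \cdot a\,dx = -\infty, \qquad \int_0^\infty a^2\,dx = +\infty.$$

So the third piece diverging tells you nothing on its own: the divergences can cancel, and the original integral may still converge.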