Let $X$ be a random variable taking values on all of $\mathbb{R}$, with probability density function $f$. Is it true that for all $r > 0$
$$E \left[ \int_{X-r}^{X+r} f(x) dx \right] \ge E \left[ \int_{X-r}^{X+r} g(x) dx \right]$$
for any other probability density function $g$?
This seems intuitively true to me, and I imagine that if it is true it has been proven somewhere, but I can't find a similar result in the standard textbooks. Even a reference would be welcome.
Taking the particular case of small $r$ ($r \to 0$) and continuous $f$, your inequality becomes equivalent to
$$ \int f^2 \ge \int f g $$
with the constraints $\int f = \int g = 1$ and $f \ge 0$, $g \ge 0$. This is clearly false: for fixed $f$, we maximize $\int f g$ not by choosing $g = f$, but by choosing $g$ concentrated around the mode (maximum) of $f$.
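To make this concrete, here is a quick numerical sketch (my own choice of densities, not part of the argument above): take $f$ to be the standard normal pdf and $g$ a narrow normal pdf centered at the mode of $f$, and compare the two integrals on a fine grid.

```python
import numpy as np

# Counterexample to  int f^2 >= int f*g:
# f = standard normal pdf, g = N(0, sigma^2) pdf concentrated at the mode of f.
sigma = 0.1
x = np.linspace(-10, 10, 200001)
dx = x[1] - x[0]

f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)                      # N(0, 1) pdf
g = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))  # N(0, sigma^2) pdf

int_ff = np.sum(f * f) * dx   # closed form: 1/(2*sqrt(pi)) ~ 0.2821
int_fg = np.sum(f * g) * dx   # closed form: 1/sqrt(2*pi*(1+sigma^2)) ~ 0.3970

print(int_ff, int_fg)         # int_fg > int_ff, so the inequality fails
```

The closed forms follow from the Gaussian product formula: $\int f^2 = 1/(2\sqrt{\pi})$ and $\int f g = 1/\sqrt{2\pi(1+\sigma^2)}$, so the gap only widens as $\sigma \to 0$.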
Incidentally, your assertion has a simple interpretation: suppose I have to guess the value of a random variable $x$ with pdf $f$, and I win if the absolute error $e = |x - \hat x|$ is less than $r$. If the inequality were true, the conclusion would be that my best strategy (in terms of expected win rate) is to make a random guess, drawing $\hat x$ as an independent random variable with the same density as $x$. But this is not true: the optimal guess is a deterministic value, the one that maximizes the corresponding integral; for small $r$ this is the mode of $f$ (maximum a posteriori).
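The guessing interpretation can also be checked by simulation. A minimal Monte Carlo sketch (my own; $f$ standard normal, and the threshold $r = 0.5$ is an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 1_000_000, 0.5
x = rng.standard_normal(n)          # the target, with pdf f = N(0, 1)

# Strategy A: random guess, drawn independently from the same density f.
guess_random = rng.standard_normal(n)
win_random = np.mean(np.abs(x - guess_random) < r)

# Strategy B: deterministic guess at the mode of f (here 0).
win_mode = np.mean(np.abs(x) < r)

print(win_random, win_mode)         # the deterministic mode guess wins more often
```

Analytically, strategy A wins with probability $2\Phi(r/\sqrt{2})-1 \approx 0.276$ (since $x - \hat x \sim N(0,2)$), while strategy B wins with probability $2\Phi(r)-1 \approx 0.383$.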