I am trying to prove the following inequality:
Suppose $f(x)$ is a density symmetric about $0$ (e.g. the standard normal density); then
$\int f^2(x)dx \geq \int f(x-a)f(x+a)dx$ for any real $a$.
My guess is that it should hold, and it does for the normal distribution, but I don't know how to prove it in greater generality using only properties of symmetric functions.
You don't even need symmetry, only that $f$ is square-integrable. By the Cauchy-Schwarz inequality,
$$\int f(x+a)f(x-a)\, dx \leq \bigg( \int f(x+a)^2\, dx \bigg)^{1/2}\bigg( \int f(x-a)^2\, dx \bigg)^{1/2}$$$$ =\bigg( \int f(x)^2\, dx \bigg)^{1/2}\bigg( \int f(x)^2\, dx \bigg)^{1/2} = \int f(x)^2\, dx,$$
where the second step uses the translation invariance of the integral (substitute $u = x \pm a$).
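As a numerical sanity check (not part of the proof), one can compare the two integrals on a grid for the standard normal density; the grid bounds and step size below are my own choices:

```python
import numpy as np

# Standard normal density
def f(t):
    return np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)

# Riemann-sum approximation of the integrals on a wide grid
x = np.linspace(-20, 20, 400001)
dx = x[1] - x[0]

lhs = np.sum(f(x) ** 2) * dx  # ∫ f(x)^2 dx, exactly 1/(2√π) for the normal

for a in [0.0, 0.5, 1.0, 3.0]:
    rhs = np.sum(f(x - a) * f(x + a)) * dx  # ∫ f(x-a) f(x+a) dx
    print(a, lhs, rhs, rhs <= lhs + 1e-12)
```

For the normal density both sides can be computed in closed form: the right-hand side equals $e^{-a^2}/(2\sqrt{\pi})$, which is indeed at most $1/(2\sqrt{\pi})$, with equality only at $a = 0$.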