compute the L^2-distance of a given function to the set of Gaussian functions


I am faced with the following question: given a probability density function $f$ over $\mathbb{R}$ with prescribed second moment $\int_{\mathbb{R}}f(x)x^2\,dx=\sigma^2$, I am trying to find the "nearest" Gaussian to $\sqrt{f}$ in $L^2$, i.e. to compute $$\inf_{a\in \mathbb{R},\, b > 0}\|\sqrt{f}-G_{a,b}\|_2^2,$$ where $G_{a,b}(x)=a\exp(-bx^2)$ and $\|\cdot\|_2$ is the usual $L^2$-norm over $\mathbb{R}.$

If one replaces the $L^2$-distance by the relative entropy $H(f\,|\,G_{a,b})$, it is easy to check that the infimum is a minimum, attained by the centered Gaussian with variance $\sigma^2.$ For the $L^2$-distance, it seems more involved. One observation: for fixed $b>0$ the problem is linear least squares in $a$, since $$\|\sqrt{f}-a e^{-bx^2}\|_2^2 = \|\sqrt{f}\|_2^2 - 2a\langle \sqrt{f}, e^{-bx^2}\rangle + a^2\|e^{-bx^2}\|_2^2,$$ which is minimized at $a^* = \langle \sqrt{f}, e^{-bx^2}\rangle / \|e^{-bx^2}\|_2^2$ with $\|e^{-bx^2}\|_2^2 = \sqrt{\pi/(2b)}$ and $\|\sqrt{f}\|_2^2 = 1$. This reduces the problem to a one-dimensional optimization over $b$, but I do not see a closed form in general.
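For concreteness, here is a numerical sketch of the reduction to a one-dimensional problem in $b$: for each fixed $b$ the optimal $a$ is found in closed form, and the remaining scalar minimization is done numerically. The choice of $f$ (a standard Laplace density, so $\sqrt{f}$ is not itself Gaussian) is just an illustrative assumption, not part of the question.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

# Hypothetical test density: standard Laplace, f(x) = exp(-|x|)/2.
def sqrt_f(x):
    return np.sqrt(0.5 * np.exp(-np.abs(x)))

def distance_sq(b):
    # For fixed b > 0 the problem is linear least squares in a:
    # ||sqrt(f) - a e^{-b x^2}||^2
    #   = ||sqrt(f)||^2 - 2a <sqrt(f), e^{-b x^2}> + a^2 ||e^{-b x^2}||^2,
    # minimized at a* = <sqrt(f), e^{-b x^2}> / ||e^{-b x^2}||^2.
    inner, _ = quad(lambda x: sqrt_f(x) * np.exp(-b * x**2), -np.inf, np.inf)
    norm_g_sq = np.sqrt(np.pi / (2.0 * b))  # = integral of e^{-2 b x^2}
    norm_f_sq = 1.0                          # ||sqrt(f)||^2 = integral of f = 1
    return norm_f_sq - inner**2 / norm_g_sq

res = minimize_scalar(distance_sq, bounds=(1e-3, 10.0), method="bounded")
print("optimal b:", res.x, " squared L2 distance:", res.fun)
```

Since $\|\sqrt{f}\|_2^2 = 1$, the squared distance always lies in $[0, 1)$; the printed value is the numerical infimum over this family for the chosen $f$.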