For a 'reasonable' pdf $\rho(x)$ (i.e. $\mathbb{P}(A)=\int_A\rho(x)\,dx$), I am trying to prove the above inequality. In general it fails: we may take $\rho$ supported only where $$x^2\exp(-x^2/2\sigma^2)>\sigma^2\exp(-x^2/2\sigma^2),$$ i.e. where $x^2>\sigma^2$. However, since both sides decay so rapidly, it seems plausible that something weaker holds: for every $\varepsilon>0$ (independent of $\sigma$) one has $$\int x^2\exp(-x^2/2\sigma^2)\rho(x)\,dx\leq C\sigma^2 \int\exp(-x^2/2\sigma^2)\rho(x)\,dx+\varepsilon.$$
I am happy to assume any decay or moment conditions on $\rho(x)$. Can anybody provide some insight?
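To make the counterexample concrete, here is a quick numerical sketch (my own illustration, not part of the question): take $\rho$ uniform on $[\sigma,2\sigma]$, so its support lies entirely in the region $x^2>\sigma^2$, and the inequality with $C=1$ and $\varepsilon=0$ fails pointwise and hence in integral form.

```python
import numpy as np

# Counterexample check: rho uniform on [sigma, 2*sigma], whose support
# lies where x^2 > sigma^2, so the weighted second moment strictly
# exceeds sigma^2 times the weighted mass.
sigma = 1.0
x = np.linspace(sigma, 2 * sigma, 100_001)
dx = x[1] - x[0]
rho = np.full_like(x, 1.0 / sigma)      # uniform density on [sigma, 2*sigma]
w = np.exp(-x**2 / (2 * sigma**2))      # Gaussian weight exp(-x^2 / 2sigma^2)

lhs = np.sum(x**2 * w * rho) * dx       # ∫ x² e^{-x²/2σ²} ρ(x) dx
rhs = sigma**2 * np.sum(w * rho) * dx   # σ² ∫ e^{-x²/2σ²} ρ(x) dx

print(lhs / rhs)  # ratio exceeds 1: the bound fails with C = 1
```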
Edit: I realize a more accurate question is whether $$\int x^2\exp(-x^2/2\sigma^2)\rho(x)\,dx\leq \mathbb{E}_\rho[X^n]\,\sigma^2 \int\exp(-x^2/2\sigma^2)\rho(x)\,dx,$$ where $\mathbb{E}_\rho[X^n]$ denotes the $n$th moment of $\rho$.
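As a sanity check of the revised bound (again my own probe, with $n=2$ chosen for illustration): for $\rho$ a standard normal, $\mathbb{E}_\rho[X^2]=1$, and the Gaussian product $e^{-x^2/2\sigma^2}e^{-x^2/2}$ has effective variance $\sigma^2/(1+\sigma^2)\le\sigma^2$, so the inequality holds for this $\rho$ at every $\sigma$.

```python
import numpy as np

# Probe the revised inequality for rho = standard normal, n = 2
# (so E_rho[X^2] = 1).  In closed form the ratio of the two integrals
# is sigma^2/(1+sigma^2) <= sigma^2, so the bound should hold.
for sigma in (0.1, 1.0, 10.0):
    x = np.linspace(-40, 40, 1_000_001)
    dx = x[1] - x[0]
    rho = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # standard normal pdf
    w = np.exp(-x**2 / (2 * sigma**2))            # Gaussian weight
    lhs = np.sum(x**2 * w * rho) * dx             # ∫ x² e^{-x²/2σ²} ρ dx
    rhs = 1.0 * sigma**2 * np.sum(w * rho) * dx   # E[X²] σ² ∫ e^{-x²/2σ²} ρ dx
    assert lhs < rhs, sigma
```

Of course a single density proves nothing; the question is whether such a bound holds for a reasonable class of $\rho$.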