I am trying to prove whether a (seemingly) intuitive inequality involving the expectations of functions of a Gaussian random variable and of a scaled version of it holds. Numerical experiments suggest that it does.
Say we have two functions $f:\mathbb R\to\mathbb R_+ $ and $g: \mathbb R \to\mathbb R_+ $, for which we know that $ g(x) \ge f(x)$ for all $x$ and that both functions are non-decreasing in $x$. Let $\alpha\in (0,1)$. Is it then true that $$\int_{-\infty}^{\infty}g(x)\varphi(x \mid \mu, \sigma^2)\, dx \geq \int_{-\infty}^{\infty}f(x)\varphi(x \mid \alpha \mu, \alpha \sigma^2)\, dx,$$ where $\varphi(x \mid \mu, \sigma^2)$ denotes the PDF of a Gaussian random variable with mean $\mu$ and variance $\sigma^2$?
Note that the two sides are the expectations of $g$ and $f$, respectively, where on the right-hand side both the mean and the variance of the Gaussian are scaled down by $\alpha$. I know the inequality holds if only $\mu$ were scaled by $\alpha\in(0,1)$ on the right-hand side while $\sigma^2$ stayed fixed on both sides (at least when $\mu\ge 0$, so that $\alpha\mu\le\mu$ and first-order stochastic dominance applies, by standard probability theory). However, in this case $\sigma^2$ is scaled down by the same factor, which makes it much harder to compare the random variables, and thus the expectations.
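For concreteness, here is a minimal sketch of the kind of numerical check I ran. The particular choice $f = g = $ logistic sigmoid (non-negative and non-decreasing) and the parameter values are illustrative assumptions only, not part of the question:

```python
import numpy as np

def gaussian_expectation(h, mu, var, num=200001, width=10.0):
    """Approximate E[h(X)] for X ~ N(mu, var) by a Riemann sum on a wide grid."""
    sd = np.sqrt(var)
    x = np.linspace(mu - width * sd, mu + width * sd, num)
    dx = x[1] - x[0]
    pdf = np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)
    return np.sum(h(x) * pdf) * dx

# Illustrative instance: f = g = logistic sigmoid, which maps R to (0,1)
# and is non-decreasing, so the hypotheses of the question are satisfied.
f = g = lambda x: 1.0 / (1.0 + np.exp(-x))

mu, var, alpha = 1.0, 1.0, 0.5
lhs = gaussian_expectation(g, mu, var)                   # E[g(X)], X ~ N(mu, sigma^2)
rhs = gaussian_expectation(f, alpha * mu, alpha * var)   # E[f(Y)], Y ~ N(alpha*mu, alpha*sigma^2)
print(lhs, rhs, lhs >= rhs)
```

In all instances of this kind that I tried (with $\mu \ge 0$), the left-hand side came out at least as large as the right-hand side.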
Thanks in advance!