Let $f\in\mathscr{C}^2(\mathbb{R},\mathbb{R})$ be a strictly increasing function, strictly convex on $(-\infty,0)$ and strictly concave on $(0,\infty)$, and let $\sigma_1>\sigma_2>0$ be two real numbers. Suppose that the function $g$ defined as:
$$g_{\sigma_1,\sigma_2}(x) \triangleq\int_\mathbb{R} (f(\sigma_1 s + x)-f(\sigma_2 s + x))e^{-s^2/2}ds$$
has only one zero in $\mathbb{R}$. Formally: $~\exists!\, x^*\in\mathbb{R},~g_{\sigma_1,\sigma_2}(x^*) = 0$.
I am numerically quite sure that then, for all $\sigma_1', \sigma_2' >0$, the function $g_{\sigma_1',\sigma_2'}$ also has only one zero in $\mathbb{R}$ (the location of the zero may of course differ).
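For what it's worth, here is the kind of numerical check I mean, as a minimal sketch. The choice $f=\tanh$ (increasing, convex on $(-\infty,0)$, concave on $(0,\infty)$) and the Gauss–Hermite quadrature are my own illustration, not part of the claim; `hermgauss` targets the weight $e^{-t^2}$, so the substitution $s=\sqrt{2}\,t$ converts it to the weight $e^{-s^2/2}$.

```python
import numpy as np

def g(x, sigma1, sigma2, f=np.tanh, n=80):
    """Evaluate g_{sigma1,sigma2}(x) by Gauss-Hermite quadrature.

    hermgauss integrates against exp(-t^2); substituting s = sqrt(2)*t
    turns this into the weight exp(-s^2/2) from the definition of g.
    """
    t, w = np.polynomial.hermite.hermgauss(n)
    s = np.sqrt(2.0) * t
    return np.sqrt(2.0) * np.sum(w * (f(sigma1 * s + x) - f(sigma2 * s + x)))

# Count sign changes of g on a grid for one choice of (sigma1, sigma2).
# Since f = tanh is odd, g is odd and vanishes at x = 0; the experiment
# checks that this is its only zero.
xs = np.linspace(-8.0, 8.0, 1601)
vals = np.array([g(x, 2.0, 1.0) for x in xs])
signs = np.sign(vals)
signs = signs[signs != 0]          # drop exact zeros (x = 0 by oddness)
changes = np.count_nonzero(np.diff(signs))
print(changes)  # 1 sign change: a unique zero, at x = 0 by symmetry
```

(The grid is kept to $[-8,8]$ because for $f=\tanh$ the integrand difference decays like $e^{-2|x|}$, and far outside that range the signal drops below double-precision cancellation noise.)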
I am also very interested in a weaker version of this claim: for $\alpha>0$, does the function $g_{\alpha\sigma_1,\alpha\sigma_2}$ have a unique zero?
I don't know how to prove this kind of "invariance" property...
Possible hints
This question arises from signal processing. The function $g$ can be seen as a $\text{DoG}$ function (Difference of Gaussians).
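To make that connection explicit: substituting $u=\sigma s$ in each term, and writing $\varphi_\sigma$ for the density of $\mathcal{N}(0,\sigma^2)$,
$$g_{\sigma_1,\sigma_2}(x) = \sqrt{2\pi}\,\big[(f*\varphi_{\sigma_1})(x)-(f*\varphi_{\sigma_2})(x)\big],$$
so $g$ is (up to the constant $\sqrt{2\pi}$) the difference of two Gaussian smoothings of $f$ at scales $\sigma_1$ and $\sigma_2$.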
Any hints, solutions, or even counter-examples will be highly appreciated. Thank you very much!