Suppose $X$ is a random variable with full support on the interval $[-1,1]$, with $\mathbb{E}[X]=0$. (You may assume the existence of a p.d.f. for $X$ if it helps.)
For any $f:[-1,1]\rightarrow\mathbb{R}$, we define $\|f\|_\infty = \sup_{x\in[-1,1]}{|f(x)|}$.
I would like to prove that there exists a constant $C>0$ such that for all sufficiently small $\sigma\in(0,1]$, and for all $g,h:[-1,1]\to\mathbb{R}$, bounded and Lipschitz continuous, with $g(0)=h(0)=0$ and with sufficiently small $\|g-h\|_{\infty}>0$:
$$|\mathbb{E}[g(\sigma X)-h(\sigma X)]|\le C \sigma \|g-h\|_{\infty}.$$
Without the $\sigma$ on the right-hand side, this inequality would follow immediately from boundedness. If $\|g-h\|_{\infty}$ were replaced by the Lipschitz constant of $g-h$, the inequality would follow from $g$ and $h$ being Lipschitz.
Is the claim above true as it stands? If not, are there non-trivial additional assumptions under which it holds? For example, it holds if $g$ and $h$ are quadratic functions (see below), which is suggestive of smoothness helping.
Additional conditions may be placed on $X$ or on the set from which $g$ and $h$ are taken. Conditions should not be placed directly on $g-h$.
Proof in the quadratic case:
Suppose $g(x)-h(x)=a x^2 + b x$, which holds if $g$ and $h$ are each quadratic. (The intercept must be $0$ as $g(0)-h(0)=0$.)
Then: $$|\mathbb{E}[g(\sigma X)-h(\sigma X)]|=\sigma^2 |a| \mathbb{E}[X^2].$$
Note also that: \begin{gather*} \|g-h\|_\infty\ge \max{\{|g(1)-h(1)|,|g(-1)-h(-1)|\}}=\max{\{|a+b|,|a-b|\}}\\ =\max{\{a+b,-(a+b),a-b,-(a-b)\}}=|a|+|b|\ge |a|. \end{gather*}
Thus as $\sigma\le 1$: $$|\mathbb{E}[g(\sigma X)-h(\sigma X)]|\le \mathbb{E}[X^2] \sigma \|g-h\|_\infty,$$ as required (with $C=\mathbb{E}[X^2]$).
Note that we could have proven something even stronger, with $\sigma^2$ in place of $\sigma$. This makes me optimistic that the claim may hold more generally.
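The quadratic-case bound can also be checked numerically. Below is a minimal sketch; the choice $X \sim U(-1,1)$ (so $\mathbb{E}[X^2] = 1/3$) and the particular pair of quadratics are illustrative assumptions, not part of the question.

```python
# Sanity check of the quadratic-case bound, assuming X ~ U(-1, 1)
# (so E[X^2] = 1/3) and an example pair of quadratics with g(0) = h(0) = 0.

N = 200001
xs = [-1.0 + 2.0 * i / (N - 1) for i in range(N)]

def expect(fn):
    """E[fn(X)] for X ~ U(-1, 1), approximated by an average over a fine grid."""
    return sum(fn(x) for x in xs) / N

g = lambda x: 2 * x * x + x   # g - h = a x^2 + b x with a = b = 1
h = lambda x: x * x

sup_diff = max(abs(g(x) - h(x)) for x in xs)   # ||g - h||_inf = |a| + |b| = 2
C = expect(lambda x: x * x)                    # C = E[X^2] = 1/3

for sigma in (1.0, 0.5, 0.1, 0.01):
    # |E[g(sigma X) - h(sigma X)]| = sigma^2 |a| E[X^2]
    lhs = abs(expect(lambda x: g(sigma * x) - h(sigma * x)))
    rhs = C * sigma * sup_diff                 # C sigma ||g - h||_inf
    assert lhs <= rhs
    print(f"sigma={sigma}: {lhs:.6f} <= {rhs:.6f}")
```

As expected, the left-hand side scales like $\sigma^2$ while the bound scales like $\sigma$, so the inequality holds with room to spare for small $\sigma$.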
$\def\d{\mathrm{d}}\def\R{\mathbb{R}}\def\paren#1{\left(#1\right)}$The proposition is not true. It will be proved that there exists a random variable $X$ with $X \in [-1, 1]$ and $E(X) = 0$ such that for any $C > 0$, $0 < σ < 1$, $ε > 0$, there exist $0 < σ_0 < σ$ and a Lipschitz $f: [-1, 1] → \R$ with $f(0) = 0$ and $0 < \|f\|_∞ < ε$, but$$ |E(f(σ_0 X))| > Cσ_0 \|f\|_∞. $$Taking $g = f$ and $h = 0$ (both bounded, Lipschitz, and vanishing at $0$) then contradicts the claimed inequality.
Take $X \sim U(-1, 1)$. Given $C > 0$, $0 < σ < 1$, $ε > 0$, we may assume $C \ge \frac{1}{3}$ without loss of generality, since violating the bound for a larger constant also violates it for any smaller one. Take$$ σ_0 = \min\paren{ \frac{1}{3C}, \frac{σ}{2} },\quad f(x) = \frac{ε}{2} \min(3C|x|, 1). $$ Then $0 < σ_0 < σ$, $f(0) = 0$, and since $3C \ge 1$ we have $0 < \|f\|_∞ = \dfrac{ε}{2} < ε$. Moreover $3Cσ_0 \le 1$, so $\min(3Cσ_0|x|, 1) = 3Cσ_0|x|$ for all $|x| \le 1$, and$$ |E(f(σ_0 X))| = \int_{-1}^1 \frac{ε}{2} \min(3Cσ_0 |x|, 1) · \frac{1}{2} \,\d x = \frac{ε}{4} \int_{-1}^1 3Cσ_0 |x| \,\d x = \frac{3}{4} Cεσ_0 > \frac{1}{2} Cεσ_0 = Cσ_0 \|f\|_∞. $$
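The counterexample can be verified numerically. Below is a minimal sketch; the particular values of $C$, $σ$, $ε$ are arbitrary test choices, and the expectation is approximated by an average over a fine grid.

```python
# Numerical check of the counterexample, with X ~ U(-1, 1) and
# arbitrary sample values of C, sigma, eps.

N = 200001
xs = [-1.0 + 2.0 * i / (N - 1) for i in range(N)]

def expect(fn):
    """E[fn(X)] for X ~ U(-1, 1), approximated by an average over a fine grid."""
    return sum(fn(x) for x in xs) / N

C, sigma, eps = 10.0, 0.5, 1e-3
sigma0 = min(1 / (3 * C), sigma / 2)
f = lambda x: (eps / 2) * min(3 * C * abs(x), 1.0)

f_sup = max(f(x) for x in xs)                 # ||f||_inf = eps/2 (here 3C >= 1)
lhs = abs(expect(lambda x: f(sigma0 * x)))    # = (3/4) C eps sigma0
rhs = C * sigma0 * f_sup                      # = (1/2) C eps sigma0
assert lhs > rhs
print(f"|E[f(sigma0 X)]| = {lhs:.4e} > C*sigma0*||f||_inf = {rhs:.4e}")
```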
Remark: Such estimates fail because $f$, though Lipschitz, can be arbitrarily steep near $0$. Thus, in order to get a working estimate, the constant $C$ has to depend on $f$.