I am looking for a probabilistic analogue for the mean value theorem for random variables of the following type:
Let $X$ be a real-valued random variable defined on a probability space $( \Omega, \mathcal{F}, P )$, let $y \in \mathbb{R}$, and let $f: \mathbb{R} \rightarrow \mathbb{R}$ be a differentiable function. Then I would like to have
$$E[f(X)] - f(y) = E[f'(C)] ( E[X] - y)$$
where $C$ is a random variable that always lies between $X$ and $y$. Of course this statement is not correct in general. I was wondering if, by imposing bounds on the first derivative of $f$, say $0 < \ell < f'(x) < L < \infty$ for all $x$, together with some other mild conditions on $X$, one could recover
$$E[f(X)] - f(y) = C ( E[X] - y) \tag{1}$$
where $C \in \mathbb{R}$ is such that $ \ell<C < L$.
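To make the target concrete, here is a small numerical sanity check of $(1)$ on one example of my own choosing (the function $f(x) = x + \sin(x)/2$, the distribution $X \sim N(1,1)$, and $y = 0$ are all illustrative assumptions, not part of the question itself). For this $f$ we have $f'(x) = 1 + \cos(x)/2$, so $\ell = 1/2$ and $L = 3/2$:

```python
import numpy as np

# Toy check of (1) on a single example (an illustration, not a proof):
# f(x) = x + sin(x)/2 has f'(x) = 1 + cos(x)/2, so 1/2 <= f'(x) <= 3/2.
f = lambda x: x + np.sin(x) / 2

rng = np.random.default_rng(0)
X = rng.normal(loc=1.0, scale=1.0, size=10**6)  # X ~ N(1, 1), so E[X] = 1
y = 0.0

# The only constant C that can satisfy (1) is this ratio of means:
C = (f(X).mean() - f(y)) / (X.mean() - y)
print(C)  # approx. 1.255, which indeed lies in (0.5, 1.5)
```

Note that such a check can only fail to refute $(1)$ for one particular $f$, $X$ and $y$; it clearly cannot establish the general statement, and the case $E[X] = y$ obviously needs care since the right-hand side of $(1)$ then vanishes.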
Up until now the closest result I have found is given in this paper, where for two random variables $X, Y$ that are stochastically ordered (one's cumulative distribution function lies strictly below the other's), and under some other mild conditions, one obtains
$$E[f(X)] - E[f(Y)] = E[f'(C)] ( E[X] - E[Y])$$
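One consequence of this identity that is easy to probe numerically: since $E[f'(C)]$ must lie in the range of $f'$, the ratio $(E[f(X)] - E[f(Y)])/(E[X] - E[Y])$ must as well. The sketch below checks this on a stochastically ordered pair of my own choosing ($Y \sim \mathrm{Exp}(1)$ and $X = Y + 1$, so $F_X$ lies strictly below $F_Y$); this is not the paper's construction of $C$, just a consistency check of the displayed formula:

```python
import numpy as np

# Toy stochastically ordered pair: Y ~ Exp(1), X = Y + 1 (so X dominates Y).
f = lambda x: x + np.sin(x) / 2  # f'(x) = 1 + cos(x)/2 lies in [0.5, 1.5]

rng = np.random.default_rng(1)
Y = rng.exponential(scale=1.0, size=10**6)
X = Y + 1.0

# If E[f(X)] - E[f(Y)] = E[f'(C)] (E[X] - E[Y]), this ratio equals E[f'(C)]
# and must therefore land in the range of f':
ratio = (f(X).mean() - f(Y).mean()) / (X.mean() - Y.mean())
print(ratio)  # approx. 1.095, inside [0.5, 1.5]
```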
This is a lovely result, but the stochastic ordering condition is really restrictive in practice. Does anyone know of something a bit less restrictive, akin to formula $(1)$?