In machine learning with neural networks, sigmoid functions are often used to mimic the plasticity dynamics of biological neurons. Often a sigmoid with a smooth curve is desired; the properties strictly demanded of these functions are usually:
$$f(x+x_0) > f(x), \quad \forall x \in \mathbb R,\ x_0 > 0\\\lim_{x\to -\infty} f(x) = -1\\\lim_{x\to \infty} f(x) = 1$$
However, as long as we have $f' = \frac{\partial f}{\partial x} \geq 0$ for all $x\in \mathbb R$ (together with the limit conditions), this should still essentially be fulfilled.
So, assuming we have a pair $f, f'$ fulfilling the above constraints, are there any general methods we can apply to periodically (over $x$) "slow down" the learning?
Own work
So far I have been thinking of, for example, something like:
$${f_1}'(x) = \left(\sin^{2n}(kx)+\epsilon\right)f'(x)$$
It is obvious to me that this will satisfy the requirements above, perhaps with some renormalization of the integral of ${f_1}'$ required. But in general it seems like a difficult modification because of the difficulty of analytic integration. Is there perhaps some smarter modification we can apply to a general $f$ or $f'$ which always gives an ${f_1}'$ for which $f_1$ is easy to calculate and which always satisfies the conditions above?
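As a minimal numerical sketch of the idea (assuming $f = \tanh$ as the base sigmoid, so $f' = 1 - \tanh^2$, with arbitrary choices of $k$, $n$, $\epsilon$), one can integrate the damped derivative on a grid and renormalize so the limits are $\pm 1$:

```python
import numpy as np

# Assumptions: f = tanh (so f'(x) = 1 - tanh(x)^2); k, n, eps chosen arbitrarily.
k, n, eps = 2.0, 3, 0.05

x = np.linspace(-10.0, 10.0, 20001)
fprime = 1.0 - np.tanh(x) ** 2                        # derivative of the base sigmoid
f1prime = (np.sin(k * x) ** (2 * n) + eps) * fprime   # damped derivative, still >= 0

# Cumulative trapezoidal integration to recover f1 from f1'
dx = x[1] - x[0]
f1 = np.concatenate(([0.0], np.cumsum((f1prime[1:] + f1prime[:-1]) * dx / 2.0)))

# Renormalize so f1 runs from -1 to +1 over the sampled range
f1 = 2.0 * (f1 - f1[0]) / (f1[-1] - f1[0]) - 1.0
```

Since ${f_1}' \geq 0$ everywhere, the resulting `f1` is non-decreasing by construction; the renormalization step is exactly the "renormalization of the integral" mentioned above, done numerically instead of analytically.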
Without getting into differential equations, you could try something like \begin{equation} f(x)=\sigma(k(wx+\sin(wx))) \end{equation} where $k,w$ are constants and $\sigma$ is your "non-wobbly" sigmoid.
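A quick sketch of this construction (taking $\sigma = \tanh$ as the base sigmoid, an assumption; any sigmoid with the right limits works, and $k, w > 0$ are free constants):

```python
import numpy as np

# sigma = tanh is an assumed base sigmoid with limits -1 and +1;
# k, w > 0 are free constants controlling slope and wobble frequency.
def wobbly_sigmoid(x, k=1.0, w=5.0):
    return np.tanh(k * (w * x + np.sin(w * x)))

# Monotonicity: the inner argument has derivative k*w*(1 + cos(w*x)) >= 0,
# so the composition with an increasing sigma is non-decreasing.
x = np.linspace(-5.0, 5.0, 10001)
y = wobbly_sigmoid(x)
```

The inner function $wx + \sin(wx)$ has derivative $w(1+\cos(wx)) \geq 0$, so the composition inherits monotonicity from $\sigma$ while "pausing" periodically where $\cos(wx) = -1$; the limits $\pm 1$ are inherited from $\sigma$ because the inner argument still goes to $\pm\infty$.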
I think there is no "correct" answer, as the function you are searching for has too broad a definition...