Assume we have an optimization method with convergence guarantees such as $\nabla f(x_k) \to 0$ when minimizing $f$ (or at least along a subsequence; the exact mode of convergence is not important here). I need to modify the method so that after every iteration a "random" move is made which only guarantees $$f(x_{k+1}) \leq f(x_k)$$ with no further conditions on this update.
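To make the setup concrete, here is a minimal sketch (all names and the base method are my own illustrative choices, not part of any specific algorithm): plain gradient descent on a quadratic, interleaved after each iteration with an arbitrary random move that is kept only when it does not increase $f$.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return 0.5 * x @ x        # f(x) = ||x||^2 / 2, so grad f(x) = x

def grad_f(x):
    return x

x = np.array([3.0, -4.0])
step = 0.1                    # fixed step size, valid since L = 1 here

for k in range(200):
    # descent step of the base method
    x = x - step * grad_f(x)
    # arbitrary "random" move, accepted only if f does not increase
    y = x + 0.5 * rng.standard_normal(2)
    if f(y) <= f(x):
        x = y
```

For this toy example convergence is trivially preserved (each gradient step contracts $f$ and the random move can only decrease it further), but in general the accepted random move can jump to a very different point with the same function value, which is exactly what raises the question below.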
What conditions on the optimization method do I need so that this extra step does not destroy its convergence properties? Is there any literature on this?