I'm having an issue solving a stochastic approximation scheme. I have a finite set of time-dependent variables $$x^t=(x_1^t, x_2^t,..., x_n^t)$$ and I can express them as a system in the form
$$x_i^{t+1}=x_i^t+\gamma_i^t(f(x^t)+u_i^t)$$
My problem is that the value of $\gamma_i^t$ varies from one variable to another in a way that cannot be rescaled to a common step size (e.g. $\gamma_i^t=\frac{1}{a_i \cdot t + 1}$ with $a_i$ a constant). I'm quite convinced the scheme converges anyway, but I can't find any reference explaining why. In Kushner & Yin 2003 I could not find a section corresponding to what I'm looking for, though that may well be because I don't have the right keyword. It would be a great help if you have any reference on why a system with different step sizes converges!
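For what it's worth, here is a minimal simulation sketch of this kind of per-coordinate-step-size scheme. Everything concrete in it is an assumption for illustration: the drift $f(x)=-x$ (a simple contraction with fixed point $0$), the constants $a_i$, and the Gaussian noise $u_i^t$ are all made up; only the update rule $x_i^{t+1}=x_i^t+\gamma_i^t(f(x^t)+u_i^t)$ with $\gamma_i^t=\frac{1}{a_i t + 1}$ comes from the question.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 3
a = np.array([1.0, 2.0, 3.0])  # illustrative per-coordinate constants a_i
x = rng.normal(size=n)         # arbitrary starting point x^0

def f(x):
    # Illustrative drift with fixed point 0 (an assumption, not from the question).
    return -x

for t in range(200_000):
    gamma = 1.0 / (a * t + 1.0)        # gamma_i^t = 1 / (a_i * t + 1)
    u = rng.normal(scale=0.1, size=n)  # zero-mean noise u_i^t
    x = x + gamma * (f(x) + u)         # the update from the question

print(np.abs(x).max())  # all coordinates drift toward the fixed point 0
```

Empirically every coordinate heads to the fixed point even though the step-size ratios $\gamma_i^t/\gamma_j^t \to a_j/a_i$ are different constants, which is consistent with the intuition in the question.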
EDIT: Let me also add that, unlike two-time-scale settings where one would have $\gamma_i^t/\gamma_j^t \to 0$, my step sizes are of the same order and their ratio converges to a constant.
Never mind the question: Theorem 2.3 in Chapter 5 of Kushner & Yin 2003 provides the answer.