I'm computing the convergence rate of an optimization algorithm. The parameter-update vector is $\varepsilon_{j}^{k}$, where $j$ indexes the vector element and $k$ the iteration. I've found that $\frac{\varepsilon_{j}^{k+1}}{\varepsilon_{j}^{k}}=1-\frac{\alpha}{\varepsilon_{j}^{k}}\frac{1}{N}\sum_{\xi}\varepsilon_{\xi}^{k}$
To get an actual number for the convergence rate, rather than an expression, I'd like to eliminate that sum. I happen to know that the elements of $\varepsilon$ take both positive and negative values and are independent of one another, so $\sum_{\xi}\varepsilon_{\xi}\approx0$. If that holds, then by the same argument $\sum_{\xi;\;\xi\neq j}\varepsilon_{\xi}\approx0$ as well, and since $\sum_{\xi}\varepsilon_{\xi}=\varepsilon_{j}+\sum_{\xi;\;\xi\neq j}\varepsilon_{\xi}$, the latter gives $\sum_{\xi}\varepsilon_{\xi}\approx\varepsilon_{j}$. I'd truly like to use this, but it doesn't sound right. If the approximation were valid, my original expression would simplify to $\frac{\varepsilon_{j}^{k+1}}{\varepsilon_{j}^{k}}=1-\frac{\alpha}{\varepsilon_{j}^{k}}\frac{1}{N}\sum_{\xi}\varepsilon_{\xi}^{k}\approx1-\frac{\alpha}{N}\frac{\varepsilon_{j}^{k}}{\varepsilon_{j}^{k}}=1-\frac{\alpha}{N}$, which is finally a number that's easy to interpret as a convergence rate.
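To make the setup concrete, here is a minimal numerical sketch of one update step. The values of $N$, $\alpha$, and the distribution of $\varepsilon$ are arbitrary choices for illustration, not from my actual problem; it just applies the update implied by the ratio above and prints the per-element ratios next to the candidate value $1-\alpha/N$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000
alpha = 0.1  # hypothetical step size, chosen arbitrarily

# Elements of either sign, drawn independently (the stated assumption)
eps = rng.standard_normal(N)

mean_eps = eps.mean()              # (1/N) * sum_xi eps_xi^k
eps_next = eps - alpha * mean_eps  # eps_j^{k+1} = eps_j^k - (alpha/N) sum_xi eps_xi^k
ratio = eps_next / eps             # eps_j^{k+1} / eps_j^k, element by element

print("candidate 1 - alpha/N :", 1 - alpha / N)
print("observed ratios (mean, std):", ratio.mean(), ratio.std())
```

Note that the observed per-element ratio is exactly $1-\alpha\,\bar{\varepsilon}/\varepsilon_{j}$, so its spread across $j$ shows directly how much the proposed replacement $\sum_{\xi}\varepsilon_{\xi}\approx\varepsilon_{j}$ matters in practice.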