While reading some optimization papers (such as this one), I came across the term "ergodic convergence rate", but I have been unable to find its definition.
As an example, the linked paper talks about an algorithm being
"...convergent with an $\mathcal{O}(1/k)$ ergodic convergence rate, where $k$ denotes the iteration number."
Does this mean that if $x^*$ is the solution and $(x_k)$ is the sequence of iterates produced by the algorithm, then $\|x_k - x^*\| = \mathcal{O}(1/k)$?
Thank you very much.
The ergodic convergence rate is the rate at which the averaged iterates $\bar{x}_n = \frac{1}{n} \sum_{k=1}^{n} x^{(k)}$ converge to the optimal solution $x^*$. The non-ergodic convergence rate is the rate at which the iterates $x^{(k)}$ themselves converge to $x^*$. So an $\mathcal{O}(1/k)$ ergodic rate is a statement about the running averages $\bar{x}_k$, not a bound on $\|x_k - x^*\|$ for the last iterate.
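To make the distinction concrete, here is a small self-contained sketch (my own illustration, not from the linked paper). It runs the subgradient method on $f(x) = |x|$, whose minimizer is $x^* = 0$: the last iterate keeps oscillating at roughly the scale of the step size, while the ergodic average $\bar{x}_n$ smooths those oscillations out.

```python
import numpy as np

# Subgradient method on f(x) = |x| with minimizer x* = 0.
# An ergodic rate bounds the averaged iterates xbar_n = (1/n) * sum_k x_k;
# a non-ergodic rate would bound the last iterate x_n itself.

def subgradient_step(x, k):
    g = np.sign(x)             # a subgradient of |x| at x
    return x - g / np.sqrt(k)  # diminishing step size 1/sqrt(k)

x = 5.0
iterates = []
for k in range(1, 2001):
    x = subgradient_step(x, k)
    iterates.append(x)

# Ergodic (running) averages: xbar_n = (1/n) * sum_{k=1}^n x_k.
x_bar = np.cumsum(iterates) / np.arange(1, len(iterates) + 1)

print(abs(iterates[-1]))  # last iterate: oscillates near 0
print(abs(x_bar[-1]))     # ergodic average: smoothed toward 0
```

Both quantities end up small here, but in general the known rate guarantees (e.g. the $\mathcal{O}(1/k)$ rate in the linked paper) are proved for $\bar{x}_k$, and the last iterate may converge more slowly or not enjoy the same bound.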