I am studying mirror descent and nonlinear projected subgradient methods. In Theorem 4.1 on page 171, the author claims that the method converges provided $$ \sum_s t_s = \infty, \qquad t_k \rightarrow 0 \,\,\,\text{as} \,\,\, k \rightarrow \infty, $$ because the right-hand side of the following bound goes to zero:
$$ \min_{1\leq s \leq k} f(x^s) - \min_{x \in X} f(x) \leq \frac{B_\psi(x^*,x^1)+(2\sigma)^{-1}\sum_{s=1}^k t_s^2\|f'(x^s)\|_*^2}{\sum_{s=1}^k t_s} $$
My question is: how do we know that the right-hand side goes to zero under these assumptions? The condition $t_k \rightarrow 0$ alone does not guarantee that $\sum_{s=1}^k t_s^2\|f'(x^s)\|_*^2$ is bounded.
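To see that this concern is justified: the step sizes $t_k = 1/\sqrt{k}$ satisfy both assumptions, yet $\sum_k t_k^2 = \sum_k 1/k$ is the harmonic series and diverges, so the numerator sum really can be unbounded. A quick numerical sketch (my own illustration, not from the book):

```python
import math

# Step sizes t_k = 1/sqrt(k): t_k -> 0 and sum t_k = infinity,
# yet sum t_k^2 = sum 1/k (the harmonic series) also diverges.
def partial_sums(n):
    s_t = 0.0   # partial sum of t_k
    s_t2 = 0.0  # partial sum of t_k^2
    for k in range(1, n + 1):
        t = 1.0 / math.sqrt(k)
        s_t += t
        s_t2 += t * t
    return s_t, s_t2

for n in (10**2, 10**4, 10**6):
    s_t, s_t2 = partial_sums(n)
    print(n, s_t, s_t2)  # s_t grows like 2*sqrt(n); s_t2 grows like log(n), unbounded
```

So boundedness of the numerator is not what saves the bound; the denominator has to win, as the answer below shows.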
Part (b) of Assumption A states that $f$ is $L_f$-Lipschitz. A standard result of convex analysis asserts that if $f'(x)\in \partial f(x)$, then $\|f'(x)\|_*\leq L_f$, where $\|\cdot\|_*$ denotes the dual norm of $\|\cdot\|$.
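As a concrete illustration of that standard result (my own toy example, not from the book): take $f(x) = |x|$ on $\mathbb{R}$, which is $1$-Lipschitz, so every subgradient should have absolute value at most $L_f = 1$. A minimal numerical check via the subgradient inequality $f(y) \geq f(x) + g(y-x)$:

```python
# f(x) = |x| is 1-Lipschitz on R; with the Euclidean norm the dual norm
# is again the absolute value. g is a subgradient of |.| at x iff
# |y| >= |x| + g*(y - x) for all y; we verify this on a grid of points.
def is_subgradient(g, x, test_points):
    return all(abs(y) >= abs(x) + g * (y - x) - 1e-12 for y in test_points)

pts = [i / 10.0 for i in range(-50, 51)]
print(is_subgradient(0.5, 0.0, pts))  # |g| <= L_f = 1: valid subgradient -> True
print(is_subgradient(1.5, 0.0, pts))  # |g| > L_f: violates the inequality -> False
```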
Thus $\displaystyle \frac{\sum_{k=1}^n t_k^2\|f'(x^k)\|_*^2}{\sum_{k=1}^n t_k}\leq L_f^2\,\frac{\sum_{k=1}^n t_k^2}{\sum_{k=1}^n t_k}$.
Let us show that $\displaystyle \frac{\sum_{k=1}^n t_k^2}{\sum_{k=1}^n t_k} \to 0$. Recall that each $t_n \geq 0$.
Let $\epsilon >0$. There exists $N$ such that $n\geq N\implies t_n\leq \epsilon$. For $n\geq N$, since $t_k^2 \leq \epsilon t_k$ whenever $k > N$, $$\sum_{k=1}^n t_k^2\leq \sum_{k=1}^N t_k^2 + \epsilon \sum_{k=N+1}^n t_k.$$ The sequence $\displaystyle \epsilon \sum_{k=N+1}^n t_k$ (indexed by $n$) diverges to $\infty$, so there exists some $N'>N$ such that $$n\geq N' \implies \sum_{k=1}^N t_k^2\leq\epsilon \sum_{k=N+1}^n t_k.$$ For $n\geq N'$, $$\sum_{k=1}^n t_k^2\leq 2\epsilon \sum_{k=N+1}^n t_k\leq 2\epsilon \sum_{k=1}^n t_k,$$ and we're done.
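The argument can be sanity-checked numerically. A minimal sketch with $t_k = 1/\sqrt{k}$ (my own choice of steps, for which $\sum_k t_k^2$ itself diverges, so boundedness of the numerator is genuinely not needed):

```python
import math

def ratio(n):
    """(sum_{k=1}^n t_k^2) / (sum_{k=1}^n t_k) for t_k = 1/sqrt(k)."""
    num = sum(1.0 / k for k in range(1, n + 1))             # sum t_k^2, harmonic
    den = sum(1.0 / math.sqrt(k) for k in range(1, n + 1))  # sum t_k ~ 2*sqrt(n)
    return num / den

for n in (10**2, 10**4, 10**6):
    print(n, ratio(n))  # decreases toward 0, roughly log(n) / (2*sqrt(n))
```

Even though both sums diverge, the denominator grows strictly faster, exactly as the proof predicts.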