This issue has bothered me for a long time, and it just came up again. For instance, I am reading some notes that say:
So they have this $(\dfrac{\kappa - 1}{\kappa + 1})^k$ factor, which behaves like $(1 - \dfrac{2}{\kappa})^k$ for $\kappa$ large, but somehow all of this implies $O(\kappa \log(\dfrac{1}{\epsilon}))$ iteration complexity. I really cannot follow this argument, especially since $\epsilon$ was never defined.
In other optimization-related texts, they conclude by saying "...therefore the algorithm converges sub-linearly or linearly", but they never define what these terms mean.
Can someone please recommend an up-to-date treatment of the iteration complexity of optimization algorithms such as gradient descent?

Three recent references that might be helpful:
Beck, Amir. Introduction to Nonlinear Optimization: Theory, Algorithms, and Applications with MATLAB. Society for Industrial and Applied Mathematics, 2014.
Bertsekas, Dimitri P. Convex Optimization Algorithms. Belmont: Athena Scientific, 2015.
Nesterov, Yurii. Introductory Lectures on Convex Optimization: A Basic Course. Vol. 87. Springer Science & Business Media, 2013.
Of these, the books by Beck and Bertsekas are much more accessible than the book by Nesterov.
It's not hard to briefly explain the $O(\kappa \log(1/\epsilon))$ iteration complexity, starting from the inequality
$\| x_{k+1} - x^{*} \| \leq (1-1/\kappa) \| x_{k}-x^{*} \|$.
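For concreteness, here is a small numerical illustration (with made-up values: a diagonal quadratic $f(x)=\tfrac{1}{2}x^{\top}Ax$, minimizer $x^{*}=0$, step size $1/L$) where gradient descent satisfies exactly this kind of per-step contraction:

```python
import numpy as np

# Illustrative setup (assumed, not from the notes): minimize
# f(x) = 0.5 * x^T A x, whose minimizer is x* = 0.
mu, L = 1.0, 10.0                      # smallest / largest eigenvalue of A
kappa = L / mu                         # condition number
A = np.diag(np.linspace(mu, L, 5))

x = np.ones(5)                         # x_0, so ||x_0 - x*|| = sqrt(5)
for _ in range(20):
    x_new = x - (1.0 / L) * (A @ x)    # gradient step: grad f(x) = A x
    # per-iteration contraction: ||x_{k+1} - x*|| <= (1 - 1/kappa) ||x_k - x*||
    assert np.linalg.norm(x_new) <= (1 - 1 / kappa) * np.linalg.norm(x)
    x = x_new
```

On this quadratic, the eigenvalues of the iteration matrix $I - A/L$ lie in $[0, 1-1/\kappa]$, which is what forces the contraction.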
Let $D_{0}=\| x_{0} - x^{*} \|$, and suppose that we want to have $\| x_{n} - x^{*} \| \leq \epsilon$. We have
$\| x_{n} - x^{*} \| \leq (1-1/\kappa)^{n} D_{0} $
So, we need
$(1-1/\kappa)^{n} D_{0} \leq \epsilon$.
$\log \left( (1-1/\kappa)^{n} D_{0} \right) \leq \log(\epsilon)$.
$n \log(1-1/\kappa) \leq \log(\epsilon) - \log(D_{0})$.
Using the fact that $\log(1-1/\kappa) < -1/\kappa$, it suffices to have
$n \left( \frac{-1}{\kappa} \right) \leq \log(\epsilon)-\log(D_{0})$,
which rearranges (note the sign flip when multiplying by $-\kappa$) to
$n \geq \kappa \left( \log(D_{0}) - \log(\epsilon) \right) = \kappa \log(D_{0}) + \kappa \log(1/\epsilon)$.
The term $\kappa \log(D_{0})$ does not depend on $\epsilon$, so there is a constant $C$ (which depends on $D_{0}$) such that if
$n \geq \kappa \log(1/\epsilon) + C \kappa$,
then
$\| x_{n} - x^{*} \| \leq \epsilon $,
which is the claimed $O(\kappa \log(1/\epsilon))$ iteration complexity.
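As a quick sanity check on this bound, one can verify numerically (with arbitrary sample values for $\kappa$, $D_{0}$, and $\epsilon$) that $n = \lceil \kappa \log(D_{0}/\epsilon) \rceil$ iterations suffice:

```python
import math

# Check: n = ceil(kappa * log(D0 / eps)) iterations drive the error
# bound (1 - 1/kappa)^n * D0 below eps (the sample values are arbitrary).
D0 = 5.0
for kappa in (2.0, 10.0, 100.0, 1000.0):
    for eps in (1e-2, 1e-6):
        n = math.ceil(kappa * math.log(D0 / eps))
        assert (1 - 1 / kappa) ** n * D0 <= eps
```

This works because $(1-1/\kappa)^{n} \leq e^{-n/\kappa} \leq \epsilon/D_{0}$ whenever $n \geq \kappa \log(D_{0}/\epsilon)$.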
You also need to understand what linear and sublinear convergence are. The sequence $(x_{k})$ converges linearly to $x^{*}$ if
$\lim_{k \rightarrow \infty} \frac{\| x_{k+1}-x^{*}\|}{\| x_{k}-x^{*} \|}=c$
for some constant $0 < c < 1$.
The sequence converges sublinearly to $x^{*}$ if $x_{k} \rightarrow x^{*}$ and
$\lim_{k \rightarrow \infty} \frac{\| x_{k+1}-x^{*}\|}{\| x_{k}-x^{*} \|}=1$.
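These definitions can be illustrated with two toy sequences (my own examples, not from any particular text): $x_{k} = (1/2)^{k}$ converges linearly with ratio $c = 1/2$, while $y_{k} = 1/k$ converges sublinearly because the ratio $k/(k+1) \rightarrow 1$:

```python
# Ratio test on two toy sequences converging to x* = 0.
# x_k = 0.5**k: successive-error ratio is exactly 0.5 (linear).
# y_k = 1/k:    successive-error ratio k/(k+1) tends to 1 (sublinear).
lin_ratio = 0.5 ** 101 / 0.5 ** 100      # = 0.5 for every k
sub_ratio = (1 / 1001) / (1 / 1000)      # = 1000/1001, close to 1
```

The geometric sequence halves its error at every step, while $1/k$ makes ever-smaller relative progress, which is why it takes $O(1/\epsilon)$ rather than $O(\log(1/\epsilon))$ iterations to reach accuracy $\epsilon$.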