I want to prove that the function
$$ U(\alpha_1, \ldots, \alpha_k) = \frac{R^2 + M^2 \sum_{i=1}^k \alpha_i^2}{2 \sum_{i=1}^k \alpha_i} $$
is convex for $R, M \geq 0$ and $\alpha_i > 0$. This function is the bound on $f_\text{best}^{(k)} - f_\star$ for the subgradient method mentioned here, but the text merely states that the function is convex, and I'm having trouble proving it.
You can rewrite your function as $$U(\alpha) = \frac{R^2}{2} \cdot \frac{1}{\sum_{i=1}^k \alpha_i} + \frac{M^2}{2} \cdot \frac{\|\alpha\|_2^2}{\sum_{i=1}^k \alpha_i},\quad \alpha \in \mathbb{R}^k_{>0},$$ and then prove convexity separately for $f(\alpha) = \frac{1}{\sum_{i=1}^k \alpha_i}$ and $g(\alpha) = \frac{\|\alpha\|_2^2}{\sum_{i=1}^k \alpha_i}$. For $f$, you can compute the Hessian directly: with $s = \sum_{i=1}^k \alpha_i$, every second partial derivative equals $2/s^3$, so $\nabla^2 f(\alpha) = \frac{2}{s^3} \mathbf{1}\mathbf{1}^\top \succeq 0$ for $s > 0$ (or argue by induction). For $g$, which is a quadratic-over-linear function, you can use this, as pointed out by @ErlonKelvim. Since multiplying by a nonnegative constant preserves convexity and the sum of convex functions is again convex, the result follows.
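As a numerical sanity check (not a proof), one can test midpoint convexity of $U$ at random points in the positive orthant: for a convex function, $U\!\left(\frac{x+y}{2}\right) \leq \frac{U(x) + U(y)}{2}$ must hold everywhere. The constants `R`, `M`, and the dimension `k` below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def U(alpha, R=1.0, M=2.0):
    """U(alpha) = (R^2 + M^2 * sum(alpha_i^2)) / (2 * sum(alpha_i))."""
    return (R**2 + M**2 * np.sum(alpha**2)) / (2 * np.sum(alpha))

# Midpoint-convexity spot check on random points with alpha_i > 0:
# count any violations of U((x + y)/2) <= (U(x) + U(y))/2.
k = 5
violations = 0
for _ in range(10_000):
    x = rng.uniform(0.01, 10.0, size=k)
    y = rng.uniform(0.01, 10.0, size=k)
    if U((x + y) / 2) > (U(x) + U(y)) / 2 + 1e-12:
        violations += 1

print(violations)  # 0: no violation found
```

Of course, passing such a check only gives evidence of convexity; the decomposition argument above is what establishes it.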