I have a learning sample $D_n = \{(X_i, Y_i)\}_{i=1}^n$, where the $X_i$'s are $\mathbb{R}^d$-valued and the $Y_i$'s are $\{\pm 1\}$-valued. Consider the function
$$ f(\theta) = \frac{1}{n} \sum_{i=1}^{n} \exp(-Y_i (\theta^T X_i)), $$
where $\theta \in \Theta = \{\theta \in \mathbb{R}^d : \|\theta\|_2 \leq B\}$ for some $B > 0$. I want to find the largest $\alpha > 0$ for which $f: \Theta \to \mathbb{R}$ is $\alpha$-strongly convex.
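To make the setup concrete, here is a small numerical sketch of $f$ on synthetic data; the sample, dimensions, radius $B$, and random seed are all my own illustrative choices, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(42)
n, d, B = 8, 3, 2.0          # sample size, dimension, ball radius (illustrative)
X = rng.normal(size=(n, d))  # the X_i's, each in R^d
Y = rng.choice([-1.0, 1.0], size=n)  # the Y_i's, in {-1, +1}

def f(theta):
    """Empirical exponential loss: (1/n) * sum_i exp(-Y_i * theta^T X_i)."""
    return np.mean(np.exp(-Y * (X @ theta)))

# Pick a point and project it onto Theta = {theta : ||theta||_2 <= B}
theta = rng.normal(size=d)
theta = theta * min(1.0, B / np.linalg.norm(theta))

val = f(theta)
assert np.linalg.norm(theta) <= B + 1e-12
```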
Let $f_i(\theta) = \exp(-Y_i (\theta^T X_i))$, so that $f = \frac{1}{n}\sum_{i=1}^n f_i$. I need to show that "$f_i$ is $\alpha$-convex for every $i$" $\implies$ "$f$ is $\alpha$-convex".
Step I: $f_i$ is $\alpha$-convex if and only if its Hessian satisfies $$ \nabla^2 f_i(\theta) \geq \alpha I \quad \text{for all } \theta \in \Theta, $$ where $I$ is the identity matrix.
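Since $Y_i^2 = 1$, the Hessian of a single term works out to $\nabla^2 f_i(\theta) = \exp(-Y_i\,\theta^T X_i)\, X_i X_i^T$. A quick sanity check of that formula against central finite differences (with synthetic $X_i$, $Y_i$, and $\theta$ of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3
x = rng.normal(size=d)   # one feature vector X_i (synthetic)
y = 1.0                  # one label Y_i in {-1, +1}
theta = rng.normal(size=d)

def f_i(t):
    return np.exp(-y * (t @ x))

# Analytic Hessian: since Y_i^2 = 1,
#   nabla^2 f_i(theta) = exp(-Y_i theta^T X_i) * X_i X_i^T
H = f_i(theta) * np.outer(x, x)

# Central finite-difference approximation of each second partial derivative
eps = 1e-4
I = np.eye(d)
H_fd = np.zeros((d, d))
for j in range(d):
    for k in range(d):
        H_fd[j, k] = (f_i(theta + eps * (I[j] + I[k]))
                      - f_i(theta + eps * (I[j] - I[k]))
                      - f_i(theta - eps * (I[j] - I[k]))
                      + f_i(theta - eps * (I[j] + I[k]))) / (4 * eps**2)

assert np.allclose(H, H_fd, rtol=1e-4, atol=1e-4)
```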
Step II: Does the implication then follow because a linear combination of $\alpha$-convex functions is also $\alpha$-convex?
Each of your $f_i$ is $\alpha$-convex. The combination you have is not just a linear combination but a convex combination: the coefficients are non-negative and sum to $1$. This is why the combination is also $\alpha$-convex. More precisely, each $\frac{1}{n} f_i$ is $\frac{\alpha}{n}$-convex, and therefore $f$ is $\alpha$-convex.
A detailed proof is as follows. By linearity of the Hessian, $$ \nabla^2\left(\frac{1}{n}f_i(\theta)\right)=\frac{1}{n}\nabla^2 f_i(\theta)\ge\frac{\alpha}{n}I, $$ and then $$ \nabla^2\left(\sum_{i=1}^n\frac{1}{n}f_i(\theta)\right)=\sum_{i=1}^n\nabla^2\left(\frac{1}{n}f_i(\theta)\right)\ge n\cdot\frac{\alpha}{n}I=\alpha I. $$ So your sum is $\alpha$-convex.
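The averaging step can be sanity-checked numerically: if each of $n$ symmetric matrices (standing in for the Hessians $\nabla^2 f_i(\theta)$) has smallest eigenvalue at least $\alpha$, then so does their average. The matrices, $n$, $d$, $\alpha$, and the seed below are my own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, alpha = 5, 4, 0.3

# Build n symmetric matrices H_i with lambda_min(H_i) >= alpha:
# A A^T is PSD, and adding alpha*I shifts every eigenvalue up by alpha.
Hs = []
for _ in range(n):
    A = rng.normal(size=(d, d))
    Hs.append(A @ A.T + alpha * np.eye(d))

for H in Hs:
    assert np.linalg.eigvalsh(H).min() >= alpha - 1e-9

# The average (1/n) * sum_i H_i keeps the same lower bound
# (Weyl: lambda_min(A + B) >= lambda_min(A) + lambda_min(B)).
H_avg = sum(Hs) / n
lam_min = np.linalg.eigvalsh(H_avg).min()
assert lam_min >= alpha - 1e-9
```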