Convex Combination Generalized to Infinite Sums


I'm independently studying Boyd & Vandenberghe's Convex Optimization and came across the following statement discussing convex combinations of infinite terms.

The idea of a convex combination can be generalized to include infinite sums, integrals, and, in the most general form, probability distributions. Suppose $\theta_1, \theta_2, \dots$ satisfy

$$\theta_i \geq 0, \quad i = 1, 2, \dots, \qquad \sum_{i=1}^{\infty} \theta_i = 1$$

and $x_1, x_2, \dots \in C$, where $C \subseteq \mathbb{R}^n$ is convex. Then

$$\sum_{i=1}^{\infty} \theta_i x_i \in C$$ if the series converges.

My first question is: what mathematical concept or proof allows us to generalize the definition of a convex combination from a finite $k$ to $k = \infty$? Earlier in the book, a convex combination was introduced simply as

A point of the form $\theta_1 x_1 + \dots + \theta_k x_k$, where $\theta_1 + \dots + \theta_k = 1$ and $\theta_i \geq 0$, $i = 1, \dots, k$ is a convex combination of the points $x_1, \dots, x_k$.

My second question is: why is it necessary that the series converge? Shouldn't the convexity of $C$ alone guarantee that $\sum_{i=1}^{\infty} \theta_i x_i \in C$, since convex sets are closed under convex combinations?
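To illustrate my worry numerically (my own counterexample sketch, not from the book): with $C = \mathbb{R}$, which is convex, $\theta_i = 2^{-i}$, and $x_i = 2^i \in C$, every term $\theta_i x_i$ equals $1$, so the partial sums grow without bound and the "infinite convex combination" is not even defined:

```python
# C = R is convex and theta_i = 2**-i sums to 1, yet with x_i = 2**i
# each term theta_i * x_i = 1, so the partial sums diverge.
def partial_sum(n):
    """Partial sum of sum_{i=1}^{n} theta_i * x_i with theta_i = 2**-i, x_i = 2**i."""
    return sum((2.0**-i) * (2.0**i) for i in range(1, n + 1))

print(partial_sum(10))   # 10.0
print(partial_sum(100))  # 100.0 -- grows without bound, so no limit point exists in C
```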

Thank you for taking the time to read this lengthy question!