Let $S_1=\{\lambda \mathbf d_1:\lambda\ge 0\}$ and $S_2=\{\lambda \mathbf d_2:\lambda\ge 0\},$ where $\mathbf d_1,\mathbf d_2$ are non-zero vectors in $\mathbb R^n.$
How can one prove that $S_1\oplus S_2=\{\lambda_1\mathbf d_1+\lambda_2\mathbf d_2:\lambda_1\ge 0,\ \lambda_2\ge 0\}$ is closed?
Can someone give me a hint, please? Thank you.
Here is a slightly more general result:
Suppose $D= \{ d_k \}_{k=1}^l$ is a finite set of points in $\mathbb R^n$. Then $K = \operatorname{cone} D = \{ \sum_{k=1}^l \lambda_k d_k : \lambda_k \ge 0\}$ is closed.
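As an aside, membership in $\operatorname{cone} D$ is easy to test numerically, which makes the lemma convenient to experiment with on small examples. Here is a sketch using `scipy.optimize.nnls` (the helper name `in_cone` and the tolerance are my own choices): the non-negative least-squares residual of $\min_{\lambda\ge 0}\|\Delta\lambda - p\|$ is zero exactly when $p$ lies in the cone.

```python
import numpy as np
from scipy.optimize import nnls

def in_cone(D, p, tol=1e-9):
    """Check whether p lies in cone(D) = {D @ lam : lam >= 0}.

    D is the (n, l) matrix whose columns are the generators d_1, ..., d_l.
    nnls solves min ||D @ lam - p|| subject to lam >= 0; the residual is
    (numerically) zero precisely when p is a conic combination of the columns.
    """
    lam, resid = nnls(D, p)
    return resid <= tol

D = np.array([[1.0, 0.0],
              [0.0, 1.0]])                  # generators d_1 = e_1, d_2 = e_2
print(in_cone(D, np.array([2.0, 3.0])))     # True: p = 2 e_1 + 3 e_2
print(in_cone(D, np.array([-1.0, 0.0])))    # False: would need a negative coefficient
```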
This is a standard result; see, for example, Dem'yanov & Malozemov, "Introduction to minimax", Appendix II, Lemma 2.6.
Suppose $p_k \in K$ and $p_k \to p$. We would like to show that $p \in K$. It is clear that $0 \in K$, so if $p = 0$ we are finished; hence we may assume $p \neq 0$.
We can write each $p_k = \sum_i \lambda_i(k) d_i$ with $\lambda_i(k) \ge 0$.
The key result here (a conic version of Carathéodory's theorem) is that each $p_k \in K$ can be written as $p_k = \sum_{i \in I_k} \lambda_i(k) d_i$, where $I_k \subset \{ 1,\dots,l\}$ is non-empty, the $\{ d_i \}_{i \in I_k}$ are linearly independent, and $\lambda_i(k) \ge 0$ for $i \in I_k$. (If the generators appearing with positive coefficients are linearly dependent, subtracting a suitable multiple of the dependence relation from the coefficients drives one of them to zero while keeping the rest non-negative; repeating this until the remaining generators are independent gives the claimed representation.)
Since $\{ 1,\dots,l\}$ has only finitely many subsets, at least one subset occurs as $I_k$ for infinitely many $k$; call it $I$. Along the corresponding subsequence we have $p_{k_n} = \sum_{i \in I} \lambda_i(k_n) d_i$.
Since the index set is fixed, we can let $\Delta$ be the matrix whose columns are formed from the $\{ d_i \}_{i \in I}$ and write $p_{k_n} = \Delta \lambda(k_n)$, where $\lambda(k_n) \ge 0$ is a vector of suitable dimension.
Since the columns are linearly independent, there is some $\alpha>0$ such that $\|\Delta \lambda\| \ge \alpha \|\lambda \|$ for all $\lambda$ (one may take $\alpha$ to be the smallest singular value of $\Delta$). Since $p_{k_n} \to p$, the $\lambda(k_n)$ are bounded and hence converge along a further subsequence to some $\lambda \ge 0$. Since $p = \Delta \lambda$, we see that $p \in K$.
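As a quick numerical sanity check of this bound (assuming NumPy; the particular matrix below is an arbitrary example with independent columns), the smallest singular value of $\Delta$ is positive and gives a valid $\alpha$:

```python
import numpy as np

# If the columns of Delta are linearly independent, the bound
# ||Delta @ lam|| >= sigma_min * ||lam|| holds with sigma_min the
# smallest singular value of Delta.
rng = np.random.default_rng(0)
Delta = np.array([[1., 0.],
                  [0., 2.],
                  [1., 1.]])          # two independent columns in R^3
sigma_min = np.linalg.svd(Delta, compute_uv=False)[-1]
assert sigma_min > 0                  # positive precisely because of independence

for _ in range(1000):
    lam = rng.standard_normal(2)
    assert np.linalg.norm(Delta @ lam) >= (sigma_min - 1e-12) * np.linalg.norm(lam)
```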