Is there an efficient algorithm to project a given point $y \in \mathbb{R}^n$ onto the set $S := \{x \in \mathbb{R}^n : x \geq 0, x_1 + \dots + x_n \leq 1 \}$? In other words, can the minimum of $\| x - y\|_2^2$ over $x \in S$ be found more efficiently than by solving a generic quadratic program?
I've seen that there are efficient algorithms to project onto the probability simplex ($\Delta := \{x \in \mathbb{R}^n : x \geq 0, x_1 + \dots + x_n = 1 \}$), as well as onto the $L_1$ unit ball (see e.g. http://machinelearning.org/archive/icml2008/papers/361.pdf). However, it seems to me that the problem above cannot be reduced to either of these.
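For concreteness, here is a sketch of the sorting-based simplex projection I have in mind, along the lines of the linked ICML 2008 paper (the function name `project_simplex` is mine, and this is my own rough implementation, not code from the paper):

```python
import numpy as np

def project_simplex(y):
    """Euclidean projection of y onto the probability simplex
    {x : x >= 0, sum(x) = 1}, via the sorting method in the
    linked ICML 2008 paper (my own sketch)."""
    u = np.sort(y)[::-1]                  # coordinates in decreasing order
    css = np.cumsum(u)                    # partial sums of sorted coordinates
    # largest index rho with u[rho] + (1 - css[rho]) / (rho + 1) > 0
    rho = np.nonzero(u + (1 - css) / np.arange(1, len(y) + 1) > 0)[0][-1]
    theta = (css[rho] - 1) / (rho + 1)    # shift making the clipped sum equal 1
    return np.maximum(y - theta, 0)       # shift and clip to the simplex
```

This runs in $O(n \log n)$ time due to the sort; my question is whether something comparable exists for $S$ itself.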
Thank you very much!