Given a set of $n$ two-dimensional unit vectors: $\left\{ \mathbf{v}_1, \dots, \mathbf{v}_n \right\}$, I want to find the coefficients $\left\{ \alpha_1, \dots, \alpha_n \right\}$, $0 \leq \alpha_i \leq 1$, that maximize the norm of the resultant vector, i.e. $||\sum_i \alpha_i \mathbf{v}_i||$.
EDIT
I have managed to show that each $\alpha_i$ must be either $0$ or $1$ (by showing that if $\alpha_i \neq 0$ in the optimal solution, then the norm of the resultant vector is strictly increasing in $\alpha_i$). The problem therefore reduces to finding the optimal combination of $\alpha_i$'s from $2^n$ possibilities.
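One way to make the endpoint argument precise is via convexity. Fixing all other coefficients and writing $\mathbf{u} = \sum_{j \neq i} \alpha_j \mathbf{v}_j$, we have (using $\|\mathbf{v}_i\| = 1$)

$$f(\alpha_i) = \left\| \mathbf{u} + \alpha_i \mathbf{v}_i \right\|^2 = \|\mathbf{u}\|^2 + 2\alpha_i \, \mathbf{u} \cdot \mathbf{v}_i + \alpha_i^2, \qquad f''(\alpha_i) = 2 > 0,$$

so $f$ is strictly convex in $\alpha_i$ and attains its maximum over $[0,1]$ at an endpoint.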
However, I believe that the search could be narrowed down much further (at least in two dimensions). My intuition tells me that the subset of vectors featuring in the optimal solution must lie in a half-plane. The search would then become $\mathcal{O} (n^2)$.
Alas, a proof eludes me.
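For reference, the $2^n$ brute force over $\{0,1\}$ coefficient choices that the problem reduces to can be sketched as follows (a minimal Python sketch; the function name is mine, and this is only practical for small $n$):

```python
import itertools
import math

def max_resultant_norm_bruteforce(vectors):
    """Try all 2^n subsets (each coefficient 0 or 1) of the given
    2-D vectors and return the largest norm of the resulting sum."""
    best = 0.0  # the empty subset gives the zero vector
    for mask in itertools.product((0, 1), repeat=len(vectors)):
        x = sum(a * vx for a, (vx, _) in zip(mask, vectors))
        y = sum(a * vy for a, (_, vy) in zip(mask, vectors))
        best = max(best, math.hypot(x, y))
    return best
```

For instance, with the unit vectors at angles $0$, $90^\circ$, and $180^\circ$, the best choice keeps only the first two, giving norm $\sqrt{2}$.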
You already know all coefficients are either 0 or 1 based on some answers and your own observations. So to get an $O(n^2)$ time solution, it suffices to prove that the vectors in the optimal solution are contiguous when the original set of vectors is ordered circularly by angle. To this end, we first prove a lemma.
Lemma: The set of vectors in an optimal solution using a minimal number of vectors must all lie in an open half-plane.
Proof: Suppose not. Then the origin lies in the convex hull of the vectors in the optimal solution (no open half-plane contains them), so there is a non-trivial non-negative combination $\sum_i c_i v_i$ of those vectors that equals zero. Scale the combination so that its largest coefficient is $1$, say $c_1 = 1$. Subtracting $\sum_i c_i v_i = 0$ from the solution leaves the resultant unchanged, gives $v_1$ the coefficient $0$, and leaves every other vector with a coefficient still in $[0,1]$. Thus by leaving out $v_1$ we obtain an optimal solution using fewer vectors, a contradiction. Hence all vectors in an optimal solution lie in a common open half-plane if the optimal solution uses a minimal number of vectors.
Using the lemma, we can prove that the optimal solution using the minimal number of vectors is a contiguous set of vectors. To see this, suppose we have an optimal solution with a minimal number of vectors whose set of vectors is not contiguous. Then there are two vectors $v_1, v_2$ in the optimal solution, and a third vector $w$ between them in angular order which is not included in the optimal solution. Let $z$ be the summation vector for the optimal solution. Since $z, v_1, v_2, w$ all lie in a common open half-plane, depending on which side of $w$ that $z$ is on, either $w$ is closer to $z$ than $v_1$ (in the sense of angle) or closer to $z$ than $v_2$.

Suppose $w$ is closer to $z$ than $v_1$ is. Consider $z - v_1$, and let $f(\alpha)$ denote the length of $(z - v_1) + \alpha v_1$. We know that adding $v_1$ improves the solution, so $f'(\alpha)$ becomes positive for some $\alpha < 1$. Moreover $f$ is convex, so $f'$ is increasing, and hence $f'(\alpha) > 0$ for all $\alpha > 1$ as well. But $w$ is closer to $z$ than $v_1$ is, so adding $w$ to the optimal solution $z$ is certainly at least as good as adding a multiple of $v_1$ of the same length as $w$, and the latter is equivalent to increasing $\alpha$ to a value greater than 1. Thus adding $w$ improves the optimal solution even further, a contradiction. So $w$ must be in the optimal solution, and hence the vectors in the optimal solution using a minimal number of vectors are contiguous when ordered circularly by angle.
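Given the contiguity result, the $O(n^2)$ search can be sketched as follows (a minimal Python sketch under the assumptions above; the function name is mine): sort the vectors by angle, then evaluate every contiguous circular window.

```python
import math

def max_resultant_norm(vectors):
    """O(n^2) search: sort the 2-D unit vectors by angle, then try
    every contiguous circular window and keep the largest resultant norm."""
    n = len(vectors)
    order = sorted(range(n), key=lambda i: math.atan2(vectors[i][1], vectors[i][0]))
    best = 0.0  # the empty subset has resultant norm 0
    for start in range(n):
        x = y = 0.0
        # Grow the window one vector at a time, so each of the n starting
        # points costs O(n) additions: O(n^2) overall after the sort.
        for length in range(1, n + 1):
            vx, vy = vectors[order[(start + length - 1) % n]]
            x += vx
            y += vy
            best = max(best, math.hypot(x, y))
    return best
```

For example, for the four unit vectors along the coordinate axes, the best window is any two angularly adjacent vectors, giving norm $\sqrt{2}$.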