I'd like to generate $N$ random variables that sum to $1$ and whose marginal distributions are as close as possible to uniform $U(0,1)$. My specific use case is $N=4$: I'm generating discrete probability distributions, but I needed more density in the upper tail for our purposes. I've become more than a little curious about optimizing this problem and generalizing to arbitrary $N$.
What I've worked on so far is trying different distributions and cranking through the calculus to get the Kullback-Leibler divergence from $U(0,1)$, $D_{KL}(U\,\|\,f) = -\int_0^1 \log(f(x))\,dx$. If we start with $X_1,X_2,X_3,X_4 \overset{iid}{\sim} U(0,1)$ and take $Z=\frac{X_1}{X_1+X_2+X_3+X_4}$, we get $D_{KL}=1.83011$. If instead we force one of the numbers, chosen at random, to be exactly uniform and scale the others to sum to its complement, the scaled coordinates are distributed as $Y=\frac{X_1}{X_1+X_2+X_3}(1-X_4)$, and the marginal density of a randomly chosen coordinate is the mixture $\frac{1}{4}+\frac{3}{4}f_Y(x)$, giving $D_{KL}=-\int_0^1 \log\left(\frac{1}{4}+\frac{3}{4}f_Y(x)\right)dx=0.445$, a marked improvement. It is therefore possible to do better than simply normalizing by the sum, but I'm curious how one would go about finding and proving the optimal generation method under this criterion; right now I'm limited to trial and error.
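For anyone who wants to experiment with other schemes, here is a rough sketch of how I check candidates numerically rather than redoing the calculus each time: simulate the marginal of a randomly chosen coordinate, estimate its density with a histogram, and approximate $D_{KL}(U\,\|\,f)=-\int_0^1\log f(x)\,dx$ by a Riemann sum. The function names and the histogram-based estimator are my own choices, not part of the question, and the estimates carry some discretization bias (especially for the normalized-sum scheme, whose density vanishes near $1$), so expect values near, not exactly at, $1.83$ and $0.445$.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_normalized(n_draws, N=4):
    # Scheme A: normalize N iid U(0,1) draws; by symmetry the first
    # coordinate has the same marginal as any other.
    X = rng.random((n_draws, N))
    return X[:, 0] / X.sum(axis=1)

def sample_complement(n_draws, N=4):
    # Scheme B: one coordinate is exactly U(0,1); the remaining N-1 iid
    # uniforms are rescaled to sum to its complement.  Returning a
    # coordinate chosen uniformly at random gives the 1/N + (N-1)/N mixture.
    U = rng.random(n_draws)                       # the exactly-uniform coordinate
    X = rng.random((n_draws, N - 1))
    scaled = X / X.sum(axis=1, keepdims=True) * (1 - U)[:, None]
    full = np.column_stack([U, scaled])           # rows sum to 1 by construction
    idx = rng.integers(N, size=n_draws)
    return full[np.arange(n_draws), idx]

def kl_uniform_to(samples, bins=200):
    # Monte Carlo estimate of D_KL(U(0,1) || f) = -\int_0^1 log f(x) dx,
    # using a histogram estimate of the density f on [0, 1].
    f, edges = np.histogram(samples, bins=bins, range=(0.0, 1.0), density=True)
    f = np.clip(f, 1e-12, None)                   # guard against log(0) in empty bins
    width = edges[1] - edges[0]
    return -np.sum(np.log(f)) * width             # Riemann sum over the bins

kl_a = kl_uniform_to(sample_normalized(1_000_000))
kl_b = kl_uniform_to(sample_complement(1_000_000))
print(f"normalized sum:       D_KL ~ {kl_a:.3f}")
print(f"uniform + complement: D_KL ~ {kl_b:.3f}")
```

One sanity check this makes easy: scheme B's mixture density is bounded below by $1/4$, so its divergence can never exceed $\log 4 \approx 1.386$, which the estimate respects.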