I need to solve the quadratic programming problem $$ \text{minimize}\,\, \sum_{j=1}^{n}(x_{j})^{2} \\ \text{subject to}\,\,\, \sum_{j=1}^{n}x_{j}=1,\\ 0 \leq x_{j}\leq u_{j}, \, \, j=1,\cdots , n $$
I know that the first thing I need to do is form the Lagrangian.
Now, for a problem in standard form (note that below, $\overline{x}$, $\overline{\lambda}$, $\overline{\mu}$ denote vectors): $$ \text{minimize} \, \, f_{0}(\overline{x}) \\ \text{subject to} \,\,\, f_{i}(\overline{x}) \leq 0, \,\,\, i=1,\cdots, m \\ h_{i}(\overline{x}) = 0, \,\,\, i = 1,\cdots, p $$ the Lagrangian looks like this: $\displaystyle L(\overline{x},\overline{\lambda}, \overline{\mu}) = f_{0}(\overline{x}) + \sum_{i=1}^{m}\lambda_{i}f_{i}(\overline{x}) + \sum_{i=1}^{p}\mu_{i}h_{i}(\overline{x})$
In this case, I am being thrown off by the fact that my sole equality constraint is a sum that adds up to $1$, so that $h_{1}(\overline{x}) = \left(\sum_{j=1}^{n}x_{j}\right)-1 = 0$, and if I want my $f_{i}(\overline{x})$'s to be $\leq 0$, I'm going to need to rewrite the last line of constraints as $x_{j} - u_{j} \leq 0$, $j = 1,\cdots , n$ and $-x_{j} \leq 0$, $j = 1, \cdots, n$.
Then, would my Lagrangian be $\displaystyle L(\overline{x},\overline{\lambda}, \overline{\nu}, \mu) = \sum_{j=1}^{n}(x_{j})^{2} + \sum_{j=1}^{n}\lambda_{j}(x_{j}-u_{j}) + \sum_{j=1}^{n}\nu_{j} (-x_{j}) + \mu\left[\left(\sum_{j=1}^{n}x_{j} \right)-1\right]$ ?
And then, how would I go about completing the problem? I've never done a problem with this many Lagrange multipliers or this many constraints before, so I'm finding it a little overwhelming...
Thank you ahead of time for your time and patience!
Basic Variational Approach
Since $$ \sum_{j=1}^nx_j=1\tag1 $$ any variation of the $x_j$'s must satisfy $$ \sum_{j=1}^n\delta x_j=0\tag2 $$ At an interior critical point of $$ \sum_{j=1}^nx_j^2\tag3 $$ any variation that maintains $(1)$ should not change $(3)$ to first order. That is, for any $\delta x_j$ that satisfies $(2)$, we must have $$ \sum_{j=1}^n2x_j\delta x_j=0\tag4 $$
Note that $(2)$ says that $(\delta x_1,\delta x_2, \delta x_3,\dots,\delta x_n)$ is perpendicular to $(1,1,1,\dots,1)$, and that is the only restriction on the $\delta x_j$, unless $x_j=0$ or $x_j=u_j$ (the edge cases). Furthermore, $(4)$ says that $(\delta x_1,\delta x_2, \delta x_3,\dots,\delta x_n)$ is perpendicular to $(x_1,x_2,x_3,\dots,x_n)$. So every $(\delta x_j)$ that is perpendicular to $(1,1,1,\dots,1)$ must also be perpendicular to $(x_1,x_2,x_3,\dots,x_n)$. That is, $(x_1,x_2,x_3,\dots,x_n)$ is parallel to $(1,1,1,\dots,1)$.
Thus, the only interior critical points happen when $$ x_1=x_2=x_3=\dots=x_n=\lambda\tag5 $$ In light of $(1)$, this means that $$ (x_1,x_2,x_3,\dots,x_n)=\tfrac1n\left(1,1,1,\dots,1\right)\tag6 $$ We also need to check the edge cases where some $x_j=0$ or some $x_j=u_j$. In those cases, we still have the analog of $(5)$ for the interior $x_j$; that is, those for which $0\lt x_j\lt u_j$.
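As a quick numeric sanity check of this variational argument (my own sketch, using NumPy), at the uniform point $(6)$ any small perturbation $\delta$ with $\sum_j \delta x_j = 0$ leaves the first-order change $(4)$ at zero and can only increase the objective $(3)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
x = np.full(n, 1.0 / n)          # the candidate minimizer (6)

for _ in range(1000):
    delta = rng.normal(size=n)
    delta -= delta.mean()         # enforce (2): sum of the variation is 0
    delta *= 1e-3                 # small step, staying in the interior
    # first-order change (4) vanishes, since x is parallel to (1,...,1)
    assert abs(2 * np.dot(x, delta)) < 1e-12
    # the objective (3) can only increase away from the critical point
    assert np.sum((x + delta) ** 2) >= np.sum(x ** 2)
```

The second assertion reflects convexity: $\sum_j (x_j+\delta x_j)^2 = \sum_j x_j^2 + 2\sum_j x_j\delta x_j + \sum_j (\delta x_j)^2$, and the cross term is exactly the vanishing first-order change.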
Lagrangian Approach
The Lagrangian would be $$ \mathcal{L}(x_1,x_2,x_3,\dots,x_n,\lambda)=\sum_{j=1}^nx_j^2-\lambda\left(\sum_{j=1}^nx_j-1\right)\tag7 $$ Setting the gradient to zero locates the interior critical points: $$ \begin{align} 0 &=\nabla\mathcal{L}(x_1,x_2,x_3,\dots,x_n,\lambda)\\ &=\left(2x_1-\lambda,2x_2-\lambda,2x_3-\lambda,\dots,2x_n-\lambda,1-\sum_{j=1}^nx_j\right)\tag8 \end{align} $$ which we can solve to get $(6)$.
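We can confirm $(6)$ with an off-the-shelf solver (a sketch assuming SciPy is available; the bounds $u_j$ below are hypothetical values chosen so that every $u_j > 1/n$, leaving the box constraints slack):

```python
import numpy as np
from scipy.optimize import minimize

n = 4
u = np.array([0.9, 0.8, 0.7, 0.6])     # hypothetical upper bounds, all > 1/n

res = minimize(
    lambda x: np.sum(x ** 2),           # the objective (3)
    x0=np.array([0.7, 0.1, 0.1, 0.1]),  # a feasible starting point
    method="SLSQP",
    bounds=[(0.0, uj) for uj in u],     # 0 <= x_j <= u_j
    constraints={"type": "eq", "fun": lambda x: np.sum(x) - 1.0},  # (1)
)
print(res.x)  # approximately the uniform point (0.25, 0.25, 0.25, 0.25)
```

Since the objective is strictly convex and the constraints are linear, the solver's answer is the unique global minimizer, matching $(6)$.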
There are $2n$ faces of dimension $n-1$, where $x_j=0$ or $x_j=u_j$, and a number of corners, etc., that need to be considered separately. They are not handled by the $n$-dimensional Lagrangian, though we can consider separate $(n-1)$-dimensional Lagrangians.
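In practice these edge cases resolve cleanly: by the analog of $(5)$, the interior coordinates share a common value $t$ while the rest are clamped at their upper bounds, so the minimizer has the form $x_j=\min(u_j,t)$ with $t$ chosen so that $(1)$ holds (the lower bounds $x_j=0$ are never active when every $u_j>0$). The following is my own sketch of that structure, finding $t$ by bisection; `min_sum_of_squares` is a hypothetical helper name:

```python
import numpy as np

def min_sum_of_squares(u, iters=100):
    """Minimize sum(x_j^2) subject to sum(x_j) = 1 and 0 <= x_j <= u_j."""
    u = np.asarray(u, dtype=float)
    assert u.sum() >= 1.0, "otherwise the problem is infeasible"
    # sum(min(u, t)) increases with t, equals sum(u) >= 1 at t = 1,
    # so a root of sum(min(u, t)) = 1 lies in [0, 1]: bisect for it.
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        t = 0.5 * (lo + hi)
        if np.minimum(u, t).sum() < 1.0:
            lo = t
        else:
            hi = t
    return np.minimum(u, t)

x = min_sum_of_squares([0.5, 0.2, 0.1, 0.4])
print(x)  # coordinates with u_j below the common value t stay clamped at u_j
```

For the bounds above, the clamped coordinates are $x_2=0.2$ and $x_3=0.1$, and the remaining two share the common value $t=0.35$, exactly the pattern described for the interior $x_j$.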