Minimizing the squared L2 norm of a vector whose elements sum to 1


Let $x$ be an $n$-dimensional vector whose elements sum to $1$: $$\sum\limits_{i=1}^n x_i=1$$ How can the squared L2 norm of $x$ be minimized subject to this constraint? I believe the solution is $\forall i\ \ x_i=\frac{1}{n}$, but I do not see how to prove this is optimal formally.

Best answer:

The problem at hand is $\inf_{\mathbf{1}^\top x = 1} x^\top x$, where $\mathbf{1}$ is the $n$-vector of all ones. The Lagrangian associated with this problem is \begin{equation*} L(x,\nu) = x^\top x + \nu(\mathbf{1}^\top x - 1). \end{equation*} This is a convex quadratic in $x$, and therefore minimizing $L$ over $x$ is equivalent to solving \begin{equation*} \nabla_x L(x^*,\nu) = 2x^* + \nu\mathbf{1} = 0. \end{equation*} Hence, $x^* = -\frac{\nu}{2}\mathbf{1}$, and therefore the dual function is \begin{equation*} g(\nu) = \inf_{x\in\mathbb{R}^n}L(x,\nu) = L(x^*,\nu) = \frac{\nu^2}{4}\mathbf{1}^\top\mathbf{1} + \nu (-\frac{\nu}{2}\mathbf{1}^\top\mathbf{1} - 1) = -\frac{n \nu^2}{4} - \nu. \end{equation*} This is a concave quadratic in $\nu$, and therefore we can maximize $g$ by setting its derivative to zero: \begin{equation*} g'(\nu^*) = -\frac{n}{2}\nu^* - 1 = 0, \end{equation*} or equivalently, $\nu^* = -\frac{2}{n}$. Substituting this result back into the optimal primal, $x^*$, yields \begin{equation*} x^* = \frac{1}{n}\mathbf{1}, \end{equation*} as you expected.
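The derivation above can be checked numerically: projecting any vector onto the hyperplane $\{\mathbf{1}^\top x = 1\}$ (the projection is $x - \frac{\mathbf{1}^\top x - 1}{n}\mathbf{1}$) gives a feasible point, and its squared norm should never fall below $\|x^*\|_2^2 = 1/n$. A minimal sketch with NumPy (the dimension $n=5$ and the sample count are arbitrary choices for illustration):

```python
import numpy as np

n = 5
x_star = np.full(n, 1.0 / n)        # claimed optimum, squared norm 1/n
assert abs(x_star.sum() - 1.0) < 1e-12

rng = np.random.default_rng(0)
for _ in range(10_000):
    x = rng.normal(size=n)
    x -= (x.sum() - 1.0) / n         # orthogonal projection onto {1^T x = 1}
    # every feasible point has squared norm at least 1/n
    assert x @ x >= x_star @ x_star - 1e-12
```

This does not replace the proof, but it makes it easy to convince yourself that no feasible point beats $x^* = \frac{1}{n}\mathbf{1}$.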