Deriving and simplifying the dual of a given convex optimization problem


I was given the following convex optimization problem:

$\min_{x \in \mathbb{R}^n} \frac{1}{2}\lVert x - y\rVert_2^2$ subject to: $x_i \geq 0$ for all $i$ and $\sum_{i=1}^{n}x_i \leq 1$.
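For concreteness, here is a minimal numerical sketch of this primal problem using `scipy.optimize.minimize` (SLSQP with nonnegativity bounds and one linear inequality constraint; the variable names and the random data are my own):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 5
y = rng.normal(size=n)

# Objective: (1/2) * ||x - y||_2^2
obj = lambda x: 0.5 * np.sum((x - y) ** 2)

res = minimize(
    obj,
    x0=np.zeros(n),
    method="SLSQP",
    bounds=[(0.0, None)] * n,  # x_i >= 0
    constraints=[{"type": "ineq", "fun": lambda x: 1.0 - x.sum()}],  # sum x_i <= 1
)

x_opt = res.x
```

The starting point $x_0 = 0$ is feasible, so the solver's result should be at least as good as $\frac{1}{2}\lVert y \rVert_2^2$.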

Introducing $\lambda \in \mathbb{R}^n, \mu \in \mathbb{R}$, the Lagrangian then looks like: $L(x, \lambda, \mu) = \frac{1}{2}\lVert x - y\rVert_2^2 - \sum_{i=1}^{n}\lambda_i x_i + \mu \left(\sum_{i=1}^{n} x_i - 1\right)$.

My task was now as follows: "Derive the dual and reduce it to a one-dimensional optimization problem by optimizing explicitly over one dual variable."

Steps: Since $L$ is convex in $x$, its minimizer is found by setting the gradient with respect to $x$ to zero, which gives $x = y + \lambda - \mu \cdot 1$, where $1$ is the all-ones vector. Plugging this result back into the Lagrangian and doing some annoying manipulations then gave me the minimum of $L$ over $x$, i.e. the dual function: $g(\lambda, \mu) = -\frac{1}{2} \lVert \lambda - \mu \cdot 1\rVert_2^2 - \langle y, \lambda - \mu \cdot 1 \rangle - \mu$.
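These two steps can be sanity-checked numerically (the setup and names below are mine): evaluating the Lagrangian at the stationary point $x = y + \lambda - \mu \cdot 1$ should reproduce the claimed expression, and since $L$ is convex in $x$, that point should be a minimizer.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
y = rng.normal(size=n)
lam = rng.random(n)   # lambda >= 0
mu = rng.random()     # mu >= 0
ones = np.ones(n)

def lagrangian(x):
    return 0.5 * np.sum((x - y) ** 2) - lam @ x + mu * (x.sum() - 1.0)

# Stationary point from grad_x L = x - y - lambda + mu*1 = 0
x_star = y + lam - mu * ones

# Claimed value of the minimum of L over x
v = lam - mu * ones
g = -0.5 * np.dot(v, v) - y @ v - mu

print(np.isclose(lagrangian(x_star), g))  # the closed form matches
```

Perturbing `x_star` in any direction should only increase the Lagrangian, confirming it is the minimizer.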

Accordingly, the dual problem is given by (unless I screwed something up):

$\max_{\lambda \in \mathbb{R}^n, \mu \in \mathbb{R}} -\frac{1}{2} \lVert \lambda - \mu \cdot 1\rVert_2^2 - \langle y, \lambda - \mu \cdot 1 \rangle - \mu$ subject to: $\lambda_i \geq 0$ for all $i$ and $\mu \geq 0$.
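As a further sanity check (again just a sketch with my own random data): by weak duality, this dual objective evaluated at any feasible $(\lambda, \mu)$ can never exceed the primal objective at any feasible $x$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
y = rng.normal(size=n)

# A feasible primal point: x >= 0 and sum(x) <= 1
x = rng.random(n)
x /= max(1.0, x.sum())

# Feasible dual variables: lambda >= 0, mu >= 0
lam = rng.random(n)
mu = rng.random()

v = lam - mu * np.ones(n)
dual_val = -0.5 * np.dot(v, v) - y @ v - mu
primal_val = 0.5 * np.sum((x - y) ** 2)

print(dual_val <= primal_val)  # weak duality
```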

My question now is how to turn this into a one-dimensional problem by explicitly optimizing over one dual variable. Since only $\mu$ is one-dimensional, we must maximize over $\lambda$, but I don't see how this can be done. It should be something obvious, like $\lambda$ having to equal $0$ or something of the sort, since this was an exam question, but I don't think that's correct. Any ideas? Thanks in advance.