Given the following optimization problem (Orthogonal Projection):
$$ {\mathcal{P}}_{\mathcal{T}} \left( x \right) = \arg \min _{y \in \mathcal{T} } \left\{ \frac{1}{2} {\left\| x - y \right\|}^{2} \right\} $$
Where $ \mathcal{T} = \left\{ x \mid {e}^{T} x = k, \; \forall i, \: 0 \leq {x}_{i} \leq 1 \right\} $ and $ \forall i, \,{e}_{i} = 1 $ and $k $ is known.
I tried solving it using the KKT conditions, yet couldn't arrive at a solution.
I was able to solve it using CVX, yet I wanted a method where I can see what happens.
- Could anyone solve it using KKT?
- How can I solve it using an iterative method? It seems to fit Projected Subgradient / Dual Projected Subgradient, yet I couldn't derive the items needed.
Thank You.
This is a Community Wiki solution; feel free to edit and extend it.
I will upvote and accept any other solution made by the community.
KKT
The Lagrangian is given by:
$$ L \left( y, \lambda \right) = \frac{1}{2} {\left\| y - x \right\|}^{2} + \mu \left( {e}^{T} y - k \right) - {\lambda}_{1}^{T} y + {\lambda}_{2}^{T} \left( y - e \right) $$
The KKT Constraints:
\begin{align} \left( 1 \right) \; & {\nabla}_{y} L \left( y, \lambda \right) = y - x + \mu e - {\lambda}_{1} + {\lambda}_{2} & = 0 \\ \left( 2 \right) \; & {\nabla}_{\mu} L \left( y, \lambda \right) = {e}^{T} y - k & = 0 \\ \left( 3 \right) \; & -{\lambda}_{1}^{T} y & = 0 \\ \left( 4 \right) \; & {\lambda}_{2}^{T} \left( y - e \right) & = 0 \\ \left( 5 \right) \; & {\lambda}_{1} & \geq 0 \\ \left( 6 \right) \; & {\lambda}_{2} & \geq 0 \end{align}
Multiplying $ \left( 1 \right) $ by $ {e}^{T} $ and using $ \left( 2 \right) $ together with $ {e}^{T} e = n $ yields:
$$ k - {e}^{T} x + \mu n - {e}^{T} {\lambda}_{1} + {e}^{T} {\lambda}_{2} = 0 \Rightarrow \mu = \frac{ {e}^{T} x - k + {e}^{T} {\lambda}_{1} - {e}^{T} {\lambda}_{2} }{ n } $$
Plugging the result back into $ \left( 1 \right) $ gives $ y = x - \mu e + {\lambda}_{1} - {\lambda}_{2} $. The complementarity conditions $ \left( 3 \right) $ - $ \left( 6 \right) $ then force, componentwise, $ {y}_{i} = \min \left( \max \left( {x}_{i} - \mu, 0 \right), 1 \right) $: if $ 0 < {y}_{i} < 1 $ both multipliers vanish, if $ {y}_{i} = 0 $ then $ {x}_{i} - \mu \leq 0 $, and if $ {y}_{i} = 1 $ then $ {x}_{i} - \mu \geq 1 $.
So a closed-form expression for $ \mu $ is indeed hard to get, but $ \mu $ is a root of the nonincreasing scalar equation $ \sum_{i} \min \left( \max \left( {x}_{i} - \mu, 0 \right), 1 \right) = k $, which can be solved by bisection.
Under Work...
Feel free to continue.
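Carrying the KKT route to the end: complementarity forces $ {y}_{i} = \min \left( \max \left( {x}_{i} - \mu, 0 \right), 1 \right) $, so it suffices to find the scalar $ \mu $ at which these clipped values sum to $ k $. A Python/NumPy sketch using bisection (the function name, bracket, and tolerance are my choices; feasibility $ 0 \leq k \leq n $ is assumed):

```python
import numpy as np

def proj_capped_simplex(x, k, tol=1e-10):
    """Project x onto {y : e^T y = k, 0 <= y <= 1} by bisection on mu."""
    # g(mu) = sum(clip(x - mu, 0, 1)) is nonincreasing in mu;
    # g(min(x) - 1) = n >= k and g(max(x)) = 0 <= k bracket the root.
    lo, hi = x.min() - 1.0, x.max()
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.clip(x - mu, 0.0, 1.0).sum() > k:
            lo = mu  # clipped sum too large -> increase mu
        else:
            hi = mu  # clipped sum too small -> decrease mu
    return np.clip(x - 0.5 * (lo + hi), 0.0, 1.0)
```

For example, projecting $ x = \left( 0.3, 2, -1, 0.7 \right) $ with $ k = 2 $ clips the second and third coordinates to the box and leaves the rest untouched.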
Projected Subgradient
Dual Projected Subgradient
Given a Problem in the form:
\begin{align*} \arg \min_{x} & \quad f \left( x \right) \\ s.t. & \quad {g}_{i} \left( x \right) \leq 0 , \; i = 1, 2, \cdots, m \\ & \quad x \in \mathcal{S} \end{align*}
Where $ f $ and $ {g}_{i}, \; i = 1, 2, \cdots, m $ are convex functions and $ \mathcal{S} $ is a closed convex set which is simple to project onto.
Then, from Amir Beck's Lecture Notes, the Dual Projected Subgradient update is given by:
$$ {x}_{j} = \arg \min_{x \in \mathcal{S}} \left\{ f \left( x \right) + \sum_{i = 1}^{m} {\lambda}_{i}^{j} {g}_{i} \left( x \right) \right\}, \quad {\lambda}^{j + 1} = {\left[ {\lambda}^{j} + {t}_{j} g \left( {x}_{j} \right) \right]}_{+} $$
Where $ {\left[ \cdot \right]}_{+} $ denotes the elementwise projection onto the nonnegative orthant and $ {t}_{j} > 0 $ is a step size.
In this problem $ f \left( y \right) = \frac{1}{2} { \left\| y - x \right\| }^{2} $, $ {g}_{i} \left( y \right) = -{y}_{i} $ for $ i = 1, 2, \ldots, n $, $ {g}_{i} \left( y \right) = {y}_{i - n} - 1 $ for $ i = n + 1, n + 2, \ldots, 2 n $, and $ \mathcal{S} = \left\{ y \mid {e}^{T} y = k \right\} $.
The subproblem $ {y}_{j} = \arg \min_{y \in \mathcal{S}} \left\{ f \left( y \right) + \sum_{i = 1}^{m} {\lambda}_{i}^{j} {g}_{i} \left( y \right) \right\} $ can be solved using Projected Gradient (the objective is smooth in $ y $); in fact, writing $ {c}_{i} = {\lambda}_{n + i}^{j} - {\lambda}_{i}^{j} $, it has the closed form $ {y}_{j} = {\mathcal{P}}_{\mathcal{S}} \left( x - c \right) $.
In this case the projection operator is given by $ {\mathcal{P}}_{{e}^{T} y = k} \left( y \right) = y - e {\left( {e}^{T} e \right)}^{-1} \left( {e}^{T} y - k \right) = y - \frac{ {e}^{T} y - k }{ n } e $.
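Since $ {e}^{T} e = n $, this projection shifts every coordinate by the same amount; as a sketch (the function name is mine):

```python
import numpy as np

def proj_hyperplane(y, k):
    # P(y) = y - e (e^T y - k) / n : shift all coordinates equally
    # so that the components sum to k.
    return y - (y.sum() - k) / y.size
```

For example, projecting $ \left( 1, 2, 3 \right) $ onto $ {e}^{T} y = 3 $ subtracts $ \left( 6 - 3 \right) / 3 = 1 $ from each coordinate.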
The gradient of $ L \left( y, \lambda \right) = \frac{1}{2} { \left\| y - x \right\| }^{2} + \sum_{i = 1}^{m} {\lambda}_{i} {g}_{i} \left( y \right) $ is given by $ {\nabla}_{y} L \left( y, \lambda \right) = y - x + {\left[ \left( {\lambda}_{n + 1} - {\lambda}_{1} \right), \left( {\lambda}_{n + 2} - {\lambda}_{2} \right), \cdots, \left( {\lambda}_{2 n} - {\lambda}_{n} \right) \right]}^{T} $.
The method can be implemented in a few lines of code:
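Here is a Python/NumPy sketch of the scheme (the constant dual step size, the iteration count, and solving the inner subproblem in closed form rather than by an inner projected gradient loop are my choices, not from the original answer):

```python
import numpy as np

def proj_hyperplane(y, k):
    # Projection onto S = {y : e^T y = k}: shift all coordinates equally.
    return y - (y.sum() - k) / y.size

def dual_projected_subgradient(x, k, step=0.5, iters=200):
    """Sketch of the dual projected subgradient method for P_T(x)."""
    n = x.size
    lam = np.zeros(2 * n)  # lam[:n] <-> -y_i <= 0, lam[n:] <-> y_i - 1 <= 0
    for _ in range(iters):
        # Inner subproblem argmin_{y in S} 0.5||y - x||^2 + lam^T g(y):
        # with c = lam[n:] - lam[:n] the objective is 0.5||y - (x - c)||^2
        # plus a constant, so its minimizer over S is proj_S(x - c).
        c = lam[n:] - lam[:n]
        y = proj_hyperplane(x - c, k)
        # Outer step: g(y) is a subgradient of the dual; keep lam >= 0.
        g = np.concatenate([-y, y - 1.0])
        lam = np.maximum(lam + step * g, 0.0)
    c = lam[n:] - lam[:n]
    return np.clip(proj_hyperplane(x - c, k), 0.0, 1.0)
```

On small examples this matches the bisection solution of the KKT system, e.g. $ x = \left( 0.3, 2, -1, 0.7 \right) $, $ k = 2 $ projects to $ \left( 0.3, 1, 0, 0.7 \right) $.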