How to project a vector on a set defined by linear inequality constraints through KKT conditions?


I need to find the projection $x \in \mathbb{R}^{k}$ of a vector $z \in \mathbb{R}^{k}$ onto the set defined by $Y \cdot x \geq 0$, where $Y$ is a given $m \times k$ matrix with no special structure.

I first build the Lagrangian as follows:

$$L=\frac{1}{2} ||x-z||^2 -\sum_{i=1}^m \lambda_i Y(i,:)x$$ and set its gradient w.r.t. $x$ to $0$ which leads to $$x=z+\sum_{i=1}^m \lambda_i Y(i,:)^T$$

The inequality constraints add the complementary-slackness KKT conditions:

$$\lambda_i \cdot Y(i,:)x=0 \; \forall i=1,...,m ; \lambda_i \geq 0$$

Substituting $x$ (and relabeling the inner summation index to $j$ to avoid a clash with $i$) gives $$\lambda_i \cdot Y(i,:)\Big[z+\sum_{j=1}^m \lambda_j Y(j,:)^T\Big]=0 \; \forall i=1,\dots,m$$

This is where it becomes harder for me. I guess that, for each $i$, either $\lambda_i$ or $Y(i,:)\big[z+\sum_{j=1}^m \lambda_j Y(j,:)^T\big]$ should be equal to $0$, but then I am stuck. Indeed, the second condition is a scalar equation coupling all the $\lambda_j$, and I don't know how to deal with it properly. Any help appreciated.
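As a side note, one concrete way to compute this projection numerically is through the dual: plugging the stationarity condition $x = z + Y^T\lambda$ into the objective shows that $\lambda$ solves $\min_{\lambda \geq 0} \|Y^T\lambda + z\|$, which is a non-negative least-squares (NNLS) problem. A minimal sketch, assuming SciPy is available (the function name `project_onto_cone` is mine):

```python
import numpy as np
from scipy.optimize import nnls

def project_onto_cone(Y, z):
    """Project z onto {x : Y @ x >= 0} via the dual NNLS problem.

    Substituting the stationarity condition x = z + Y.T @ lam into the
    objective shows lam solves  min_{lam >= 0} ||Y.T @ lam + z||,
    i.e. an NNLS problem with A = Y.T and b = -z.
    """
    lam, _ = nnls(Y.T, -z)
    return z + Y.T @ lam, lam

# Example: Y = I projects onto the nonnegative orthant.
Y = np.eye(2)
z = np.array([-1.0, 2.0])
x, lam = project_onto_cone(Y, z)
# x is [0., 2.]: the negative coordinate is clipped to the boundary.
```

Note how the returned multipliers satisfy complementary slackness: $\lambda_i > 0$ only on the rows where $Y(i,:)x = 0$.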

Answer:

You are right: up to relabeling indices, for each $i$ either $Y(i,:)[z+\sum_{j=1}^m \lambda_j Y(j,:)^T]=0$ or $\lambda_i=0$. Let $\mathcal S=\left\lbrace i : Y(i,:)[z+\sum_{j=1}^m \lambda_j Y(j,:)^T]=0 \right\rbrace$ (the active set); then \begin{align*} x = z + \sum_{i\in\mathcal S} \lambda_i Y(i,:)^T \end{align*} since $\lambda_i = 0$ for $i \notin \mathcal S$. This satisfies the KKT conditions and, the problem being convex, is your solution. You cannot get a nicer closed form unless the rows of $Y$ are orthogonal or satisfy similar conditions.

Solving for the $\lambda$ parameters requires you to solve the system of $|\mathcal S|$ linear equations \begin{align*} Y(i,:)[z+\sum_{j=1}^m \lambda_j Y(j,:)^T]=0&&\text{for } i\in\mathcal S \end{align*} There is a circularity here: the active set $\mathcal S$ itself depends on the $\lambda_j$, so in general one must search over candidate active sets. But solving those equations is numerically easy and can be done efficiently if the matrix $Y$ has some structure.
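For small $m$, the circularity can be resolved by brute force: enumerate candidate active sets $\mathcal S$, solve $Y_{\mathcal S} Y_{\mathcal S}^T \lambda_{\mathcal S} = -Y_{\mathcal S} z$, and accept the first candidate that is both dual feasible ($\lambda \geq 0$) and primal feasible ($Yx \geq 0$). A sketch of this idea (the function name `project_active_set` is mine; this is exponential in $m$ and only meant to illustrate the KKT system):

```python
import itertools
import numpy as np

def project_active_set(Y, z, tol=1e-9):
    """Enumerate active sets S and solve Y_S Y_S^T lam_S = -Y_S z.

    Accepts the first candidate satisfying dual feasibility
    (lam >= 0) and primal feasibility (Y x >= 0). Since the problem
    is convex, any KKT point found this way is the projection.
    """
    m = Y.shape[0]
    for r in range(m + 1):
        for S in itertools.combinations(range(m), r):
            S = list(S)
            if not S:
                x = z  # empty active set: z itself must be feasible
            else:
                YS = Y[S, :]
                try:
                    lam_S = np.linalg.solve(YS @ YS.T, -YS @ z)
                except np.linalg.LinAlgError:
                    continue  # Y_S Y_S^T singular: skip this candidate
                if np.any(lam_S < -tol):
                    continue  # dual infeasible
                x = z + YS.T @ lam_S
            if np.all(Y @ x >= -tol):
                return x
    raise RuntimeError("no KKT point found")
```

For larger $m$, an off-the-shelf QP or NNLS solver is preferable to this enumeration.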