A Matrix Optimization Problem


Given an $n\times d$ matrix $Y$, I am looking for an algorithm to find an $n$-vector $\mathbf{v}$ ($0\le \mathbf{v}_i\le 1$ for all $i$) that minimizes $\sum_{i:X_i<0}X_i$, where $X:= \mathbf{v} Y$. Is there such an algorithm with running time polynomial in $n$?

If it helps, we also know that if $\forall i, \mathbf{v}_i \in \{0,1\}$ then $\sum_{i:X_i<0}X_i \ge -\sqrt{n}$.
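For concreteness, here is a small sketch (Python/NumPy; the function name and data are my own illustration, not from any library) of the quantity to be minimized, reading $X = \mathbf{v} Y$ as a row vector times a matrix:

```python
import numpy as np

def objective(v, Y):
    """Sum of the negative entries of X = v @ Y.

    v : shape (n,), entries in [0, 1]
    Y : shape (n, d)
    """
    X = v @ Y                # X has shape (d,)
    return X[X < 0].sum()    # sum over indices i with X_i < 0

# tiny usage example (made-up data)
Y = np.array([[ 1.0, -2.0],
              [-1.0,  1.0]])
v = np.array([1.0, 0.5])
# X = v @ Y = [0.5, -1.5], so the objective is -1.5
print(objective(v, Y))
```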


Your problem as stated here is non-convex, and to the best of my knowledge there is no polynomial-time algorithm for it unless $P=NP$. However, you can approximate it with a convex optimization program, as I explain in the following.

Your non-convex program can be formulated in the following form:

Let $v\in \mathbb{R}^n$ and let $e_i\in\mathbb{R}^d$ be the $i$th standard basis vector (i.e., the $i$th entry of $e_i$ is $1$ and the rest are $0$).

Then you have:

$\text{minimize}_{v}\ \sum_{i=1}^{d} \min(0,\,e_i^T Y^T v)$

subject to: $0\preceq v\preceq 1$

and the function $\min(0,e_i^T Y^T v)$ is the component that makes it non-convex: it is the minimum of two linear (hence concave) functions of $v$, so it is concave, and minimizing a sum of concave functions is a non-convex problem.
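To see the non-convexity numerically: for a convex function, the value at the midpoint of a segment is at most the average of the endpoint values, and $\min(0,x)$ violates this. A minimal check (plain Python, illustrative only):

```python
def f(x):
    return min(0.0, x)

a, b = -1.0, 1.0
mid = f((a + b) / 2)     # f(0) = 0
avg = (f(a) + f(b)) / 2  # (-1 + 0) / 2 = -0.5
# Convexity would require mid <= avg, but here 0 > -0.5.
print(mid, avg)
```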

Here is a suggestion: write a convex relaxation of the problem as:

$\text{minimize}_{v,t} \sum_i t_i$

subject to:

$0\preceq v\preceq 1$

$t_i\leq 0$

$ t_i\leq e_i^T Y^T v$

The problem with this convex formulation is that it is unbounded: every $t_i \rightarrow -\infty$ remains feasible and drives the objective to $-\infty$. This is where the approximation helps: you want each $t_i$ to be as close to $e_i^T Y^T v$ as possible, so solve the following program instead:

$\text{minimize}_{v,t}\ \sum_i \left( t_i + (t_i-e_i^T Y^T v)^2 \right)$

subject to:

$0\preceq v\preceq 1$

$t_i\leq 0$

$ t_i\leq e_i^T Y^T v$

This program is convex, can be solved efficiently using any quadratic programming (QP) solver, and approximates what you're looking for. I hope this helps.
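As one way to try this in practice, here is a sketch that feeds the penalized program to SciPy's general-purpose SLSQP solver (a dedicated QP solver would be the more natural tool; the function name and example data are my own illustration):

```python
import numpy as np
from scipy.optimize import minimize

def solve_relaxation(Y):
    """Solve  min_{v,t}  sum_i [ t_i + (t_i - (Y^T v)_i)^2 ]
    subject to 0 <= v <= 1, t <= 0, t <= Y^T v."""
    n, d = Y.shape

    def obj(x):
        v, t = x[:n], x[n:]
        X = Y.T @ v
        return t.sum() + ((t - X) ** 2).sum()

    # t_i <= (Y^T v)_i  rewritten as  (Y^T v)_i - t_i >= 0
    cons = [{"type": "ineq", "fun": lambda x: Y.T @ x[:n] - x[n:]}]
    # box constraints: 0 <= v_i <= 1 and t_i <= 0
    bounds = [(0.0, 1.0)] * n + [(None, 0.0)] * d

    x0 = np.concatenate([0.5 * np.ones(n), -np.ones(d)])
    res = minimize(obj, x0, method="SLSQP", bounds=bounds, constraints=cons)
    return res.x[:n], res.x[n:]

# made-up example data
Y = np.array([[ 1.0, -2.0],
              [-1.0,  1.0]])
v, t = solve_relaxation(Y)
print(v, t)
```

The returned $t_i$ approximate $\min(0, e_i^T Y^T v)$, so $\sum_i t_i$ approximates the original objective at the recovered $v$.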