Given $f$ belongs to a finite dimensional subspace of an RKHS, how can we project it onto the subspace where $f>0$?


Consider $f$ in an RKHS $\mathcal{H}$ with Gaussian kernel $K$, of the form

$$ f(\cdot)=\sum_{i=1}^{n} a_i K(b_i,\cdot)$$

where $b_i$'s are given. I now wish to find

$$ g(\cdot)=\sum_{i=1}^{n} a_i' K(b_i,\cdot)$$

such that $\|f-g\|_\mathcal{H}^2$ is minimized, subject to $g(t)>0$ for all $t$.

To formulate the objective, note that since $\langle K(x,\cdot),K(y,\cdot)\rangle_\mathcal{H}=K(x,y)$, minimizing $\|f-g\|_\mathcal{H}^2$ is equivalent to minimizing $(a-a')^\top M (a-a')$, where $a,a'$ are the coefficient vectors and $M_{ij}=K(b_i,b_j)$. For distinct $b_i$, $M$ is positive definite (the Gaussian kernel is strictly positive definite), so the problem is essentially a quadratic program. The difficulty I'm experiencing is finding an efficient way to represent the constraint $g(\cdot)>0$.
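As a sanity check, the reduction to a quadratic form is easy to verify numerically. The sketch below uses an assumed bandwidth $\sigma=1$ and made-up centers $b_i$ and coefficients; it builds the Gram matrix $M$ and evaluates $(a-a')^\top M(a-a')$:

```python
import numpy as np

def gauss_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel K(x, y) = exp(-(x - y)^2 / (2 sigma^2))
    return np.exp(-(x - y) ** 2 / (2 * sigma ** 2))

b = np.array([0.0, 0.5, 1.2])          # given centers b_i (illustrative)
a = np.array([1.0, -0.4, 0.7])         # coefficients of f (illustrative)
a_prime = np.array([0.9, -0.1, 0.6])   # candidate coefficients of g

# Gram matrix M_ij = K(b_i, b_j), built by broadcasting
M = gauss_kernel(b[:, None], b[None, :])

# RKHS distance ||f - g||_H^2 = (a - a')^T M (a - a')
d = a - a_prime
dist_sq = d @ M @ d
```

Since $M$ is symmetric positive definite, `dist_sq` is nonnegative and vanishes only when $a'=a$.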

For any fixed $t$, $g(t)>0$ is a linear constraint on $a'$. However, requiring $g(t)>0$ for all $t$ yields infinitely many such constraints. One can of course discretize the time axis, but the computational cost is high, and the discretized constraint is weaker than the original one.
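A minimal sketch of the discretization approach, assuming a uniform grid and SciPy's SLSQP solver (the grid range, bandwidth, and data are all illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize

def gauss_kernel(x, y, sigma=1.0):
    return np.exp(-(x - y) ** 2 / (2 * sigma ** 2))

b = np.array([0.0, 0.5, 1.2])
a = np.array([1.0, -1.5, 0.7])   # this f dips below zero near t = 0.5
M = gauss_kernel(b[:, None], b[None, :])

# Discretize: enforce g(t_j) >= 0 on a grid, a weaker surrogate
# for g > 0 everywhere.
t_grid = np.linspace(-2.0, 3.0, 200)
G = gauss_kernel(t_grid[:, None], b[None, :])  # G[j, i] = K(t_j, b_i)

def objective(ap):
    d = a - ap
    return d @ M @ d                # (a - a')^T M (a - a')

res = minimize(objective, x0=a, method="SLSQP",
               constraints=[{"type": "ineq", "fun": lambda ap: G @ ap}])
a_prime = res.x                     # g(t_j) >= 0 up to solver tolerance
```

Each row of `G @ ap >= 0` is one linear inequality $g(t_j)\ge 0$, which is why the discretized problem is a standard QP; the cost grows with the grid size.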

Is there an efficient way to represent the constraint $g(\cdot)>0$ as finitely many linear constraints on $a'$?

Best Answer

Unless you can somehow use special properties of your kernel functions, the best solution to your problem is probably to use a cutting plane outer approximation:

  1. Find the best approximation without the nonnegativity constraints.

  2. Find one or more violated linear inequalities involving the coefficients $a'$.

  3. Reoptimize with the new set of inequality constraints. It is helpful to use a solver which supports "warm starts."

  4. Repeat until $g$ is close enough to non-negative.

These kinds of cutting plane approaches are well established in convex optimization and can work reasonably well in practice.
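The steps above can be sketched as follows. The cut-finding step here simply scans a fine grid for the minimizer of $g$ (a solver with warm starts would be faster for larger problems), and the bandwidth and data are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

def gauss_kernel(x, y, sigma=1.0):
    return np.exp(-(x - y) ** 2 / (2 * sigma ** 2))

b = np.array([0.0, 0.5, 1.2])
a = np.array([1.0, -1.5, 0.7])      # f dips below zero near t = 0.5
M = gauss_kernel(b[:, None], b[None, :])

t_search = np.linspace(-2.0, 3.0, 2000)  # fine grid used only to *find* cuts
cut_points = []                          # t values where we enforce g(t) >= 0
a_prime = a.copy()                       # step 1: unconstrained optimum is a' = a

for _ in range(50):
    # step 2: locate the most violated point of g(t) >= 0
    g_vals = gauss_kernel(t_search[:, None], b[None, :]) @ a_prime
    j = np.argmin(g_vals)
    if g_vals[j] >= -1e-8:               # step 4: close enough to nonnegative
        break
    cut_points.append(t_search[j])       # record the violated cut
    # step 3: reoptimize with the accumulated cuts
    C = gauss_kernel(np.array(cut_points)[:, None], b[None, :])
    res = minimize(lambda ap: (a - ap) @ M @ (a - ap), x0=a_prime,
                   method="SLSQP",
                   constraints=[{"type": "ineq", "fun": lambda ap: C @ ap}])
    a_prime = res.x
```

Because the objective is a convex quadratic and each cut is linear, every subproblem is a QP, and only the few constraints that actually bind ever enter the problem.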