Solving a quadratic programming problem arising in non-linear SVM


According to Wikipedia, we need to solve a specific quadratic programming problem in order to use the SVM algorithm with kernels. I would like to solve this quadratic problem with the Python library CVXOPT, but its QP routine expects the following form:

minimise: $\frac{1}{2} x^T P x + q^T x$

subject to: $Gx \leq h$ and $Ax = b$.

In the context of support-vector machines, I have to solve the following quadratic problem:

maximise: $$f(c_1,...,c_n) = \sum_{i=1}^n c_i - \frac{1}{2} \sum_{i=1}^n \sum_{j=1}^n y_i c_i k(x_i,x_j) y_j c_j$$

subject to: $\sum_{i=1}^n c_i y_i = 0$ and $0 \leq c_i \leq \frac{1}{2 n \lambda}$ for all $i$.

Here $k(\cdot,\cdot)$ is a positive-definite kernel function.

I don't know how to put my problem into the form needed by CVXOPT. Can you help me? Specifically, what would $P, q, G, h, A, b$ be in this context?


BEST ANSWER

First difference: CVXOPT expects a minimization problem, but you have a maximization problem. This is resolved by multiplying the objective function by $-1$ and minimizing instead.

$P$ describes the quadratic part of the (negated) objective, so let $P_{ij} = y_i k(x_i, x_j) y_j$. The linear part is $q \in \mathbb{R}^n$ with $q_i = -1$ for every $i$.
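To make this mapping concrete, here is a small sketch of assembling $P$ and $q$ with NumPy. The data, labels, and RBF kernel are made up purely for illustration; any positive-definite kernel works:

```python
import numpy as np

# Toy data (illustrative only): 3 one-dimensional points with labels +/-1
X = np.array([[0.0], [1.0], [3.0]])
y = np.array([1.0, -1.0, 1.0])
n = len(y)

def k(a, b):
    # RBF kernel with gamma = 1, an arbitrary positive-definite choice
    return np.exp(-np.sum((a - b) ** 2))

# Kernel (Gram) matrix K_ij = k(x_i, x_j)
K = np.array([[k(X[i], X[j]) for j in range(n)] for i in range(n)])

P = np.outer(y, y) * K   # P_ij = y_i k(x_i, x_j) y_j
q = -np.ones(n)          # q_i = -1 for every i

print(P)
print(q)
```

Note that $P = \operatorname{diag}(y)\, K\, \operatorname{diag}(y)$, so $P$ is positive definite whenever $K$ is, which is exactly what CVXOPT's QP solver needs.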

Next, consider the inequality constraints, which consist of $c_i \le \frac{1}{2n\lambda}$ and $-c_i \le 0$ for all $i$. Hence we can let $G=\begin{bmatrix} I \\ -I\end{bmatrix} \in \mathbb{R}^{2n \times n}$. As an exercise, can you write down the corresponding $h \in \mathbb{R}^{2n}$?
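For concreteness, here is a NumPy sketch of building $G$ (and, spoiling the exercise slightly, $h$); the values of $n$ and $\lambda$ are arbitrary toy choices:

```python
import numpy as np

n = 3      # number of training points (toy value)
lam = 0.5  # regularization parameter lambda (assumed)

# Top n rows encode c_i <= 1/(2 n lambda); bottom n rows encode -c_i <= 0
G = np.vstack([np.eye(n), -np.eye(n)])
h = np.concatenate([np.full(n, 1.0 / (2 * n * lam)), np.zeros(n)])

print(G.shape, h)
```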

Finally, the equality constraint $\sum_{i=1}^n c_i y_i = 0$ gives $b = 0$. As an exercise, can you write down the corresponding $A$ (a $1 \times n$ row vector)?