Finding the dual of a problem $\min_{w,b} \sum_{j=1}^{n}{\max(0,w_j)}$


Consider the following optimization problem for a support vector machine, where

$(x_i,y_i),\ i=1,2,\ldots,m$ is the training data set,

$y_i\in\{-1,1\}$, and

$w\in \mathbb R^n$

$\min_{w,b} \sum_{j=1}^{n}{\max(0,w_j)}\\\text{s.t.: } y_i(w^Tx_i+b)\geq1,\forall i.$

Derive the dual of this problem.

Hello, I am trying to learn optimization techniques and am new to the concept of the dual.

I understand that in this case I am supposed to minimize the Lagrangian with respect to $w$ and $b$,

$L=\sum_{j=1}^{n}{\max(0,w_j)}-\sum_{i=1}^{m}{\lambda_i[y_i(w^Tx_i+b)-1]},\qquad \lambda_i\geq 0,$

but I am unsure how to proceed with the differentiation, since $\max(0,w_j)$ is not differentiable at $w_j=0$.

Could someone demonstrate how to derive the dual of this problem?
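Here is a sketch of how I think the minimization might go, using an infimum in place of a derivative since the objective is not differentiable (please correct me if this is wrong). Minimizing over $b$: the only $b$-dependent term in $L$ is $-b\sum_{i=1}^{m}\lambda_i y_i$, which is unbounded below unless

$\sum_{i=1}^{m}\lambda_i y_i=0.$

Minimizing over $w$ then separates across coordinates. Writing $c_j=\sum_{i=1}^{m}\lambda_i y_i x_{ij}$, each coordinate contributes

$\inf_{w_j}\big[\max(0,w_j)-c_j w_j\big]=\begin{cases}0 & \text{if } 0\le c_j\le 1,\\ -\infty & \text{otherwise,}\end{cases}$

since for $w_j<0$ the expression equals $-c_j w_j$ (unbounded below if $c_j<0$), and for $w_j>0$ it equals $(1-c_j)w_j$ (unbounded below if $c_j>1$). If this is right, the dual would be

$\max_{\lambda\ge 0}\ \sum_{i=1}^{m}\lambda_i\quad\text{s.t. }\sum_{i=1}^{m}\lambda_i y_i=0,\qquad 0\le\sum_{i=1}^{m}\lambda_i y_i x_{ij}\le 1,\ \forall j.$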