Linear hard-margin SVM: proof of optimal $w^*$ and $b^*$


Hello, I have a task about a hard-margin SVM whose input is only 2 data points. Does anyone know how to approach it? I don't think it requires constructing a numerical example and solving it with the Lagrangian method, but how else can I approach it? I have attached a picture of the assignment.


Well, you can visualize the problem. A hard-margin SVM searches for the maximum-margin separating hyperplane. Let's take: $$\ X=\{(1, 0), \ (0, 1)\}\\ Y=\{+1, -1\} $$ You can clearly see that the maximum-margin separating hyperplane (a line in this toy example) is the line $x-y=0$: it is the perpendicular bisector of the segment joining the two points, so both points are equidistant from it. We now have to come up with a $w$ for this line. Note that $b=0$, since the line passes through the origin.
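The geometric argument can be written in closed form (this is my own hedged derivation, not part of the original assignment): with only two points $x_+$ (label $+1$) and $x_-$ (label $-1$), both are support vectors, so both margin constraints are active, $\langle w, x_+\rangle + b = 1$ and $\langle w, x_-\rangle + b = -1$, and $w$ points along $x_+ - x_-$ (perpendicular to the bisector). Solving gives $$ w^* = \frac{2\,(x_+ - x_-)}{\|x_+ - x_-\|^2}, \qquad b^* = -\tfrac{1}{2}\,\langle w^*,\ x_+ + x_-\rangle. $$ Plugging in $x_+=(1,0)$ and $x_-=(0,1)$: $\|x_+-x_-\|^2 = 2$, so $w^* = [1, -1]^T$ and $b^* = 0$, matching the picture.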

You can check that $w^*=[1, -1]^T$ is the solution (note the sign: $(1,0)$ carries label $+1$, so we need $\langle w^*, (1,0)\rangle + b > 0$). Both points satisfy $y_i(\langle w^*, x_i\rangle + b^*) = 1$, so both are support vectors, and the margin is $1/\|w^*\| = 1/\sqrt{2}$. This is no different from the Lagrangian approach; we have just considered a very easy case where we can compute $w^*$ by simple geometry and algebra.
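Here is a quick numeric sketch of that check (my own illustration, not from the assignment), using the two-point closed form $w^* = 2(x_+ - x_-)/\|x_+ - x_-\|^2$ and the active constraint $\langle w^*, x_+\rangle + b^* = 1$:

```python
import numpy as np

# The toy example from above: (1, 0) has label +1, (0, 1) has label -1.
x_pos = np.array([1.0, 0.0])
x_neg = np.array([0.0, 1.0])

# Closed form for a two-point hard-margin SVM: w points along x_pos - x_neg,
# scaled so that both points sit exactly on the margin.
d = x_pos - x_neg
w = 2 * d / np.dot(d, d)        # w* = 2 (x+ - x-) / ||x+ - x-||^2
b = 1.0 - np.dot(w, x_pos)      # from the active constraint <w, x+> + b = 1

print(w, b)                     # -> [ 1. -1.] 0.0

# Verify both margin constraints y_i (<w, x_i> + b) = 1 hold with equality,
# i.e. both points are support vectors.
assert np.isclose(np.dot(w, x_pos) + b, 1.0)
assert np.isclose(np.dot(w, x_neg) + b, -1.0)

# Margin width between the two supporting hyperplanes is 2 / ||w||,
# which here equals ||x+ - x-|| = sqrt(2): the points just touch the margin.
print(2 / np.linalg.norm(w))    # -> 1.4142...
```

Any more data points added outside this margin would leave $w^*$ and $b^*$ unchanged, which is exactly the support-vector property.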

Hint: an SVM only cares about the support vectors, which are typically very few compared to the total number of samples. If you can identify the support vectors geometrically, the remaining data points have no effect on $w^*$. Of course, software cannot use this geometric trick, and neither can you in higher dimensions or in tougher cases.