Understanding vector calculation in support vectors


I was going through Andrew Ng's lecture notes for the Machine Learning course at Stanford, and I have a small, possibly silly, doubt.

On pages 48–49 of the pdf, the notes define the geometric margin ($\gamma^{(i)}$) between a training point $A=x^{(i)}$ and a decision boundary ($w^Tx+b=0$) in the context of SVMs:
The notes say the following:

How can we find the value of $\gamma^{(i)}$? Well, $w/\Vert w\Vert$ is a unit-length vector pointing in the same direction as $w$. Since $A$ represents $x^{(i)}$, we therefore find that the point $B$ is given by $x^{(i)}-\gamma^{(i)}\cdot w/\Vert w\Vert$. But this point lies on the decision boundary, and all points $x$ on the decision boundary satisfy the equation $w^Tx+b=0$. Hence, $$w^T\left(x^{(i)}-\gamma^{(i)}\frac{w}{\Vert w\Vert}\right)+b=0$$ Solving for $\gamma^{(i)}$ yields $$\gamma^{(i)}= \frac{w^Tx^{(i)}+b}{\Vert w \Vert}$$

My doubt is that I am instead getting $\gamma^{(i)}= w^Tx^{(i)}+b$:

$$w^Tx^{(i)}-w^T\gamma^{(i)}\frac{w}{\Vert w\Vert}+b=0$$ $$w^Tx^{(i)}-\gamma^{(i)}+b=0 \quad\quad(\because w^Tw=\Vert w\Vert)$$ $$\therefore\gamma^{(i)}= w^Tx^{(i)}+b$$

Am I plainly wrong here, or is something more going on?

Accepted answer:

The error is where you have $w^Tw = \Vert w\Vert$. This should be $w^Tw = \Vert w\Vert^2$, because $\Vert w\Vert = \sqrt{w^Tw}$. Using the correction, we have $$w^T\left(x^{(i)} - \gamma^{(i)}\frac{w}{\Vert w\Vert}\right) + b = 0$$ $$w^Tx^{(i)} - \gamma^{(i)}\frac{\Vert w\Vert^2}{\Vert w\Vert} + b = 0$$ $$w^Tx^{(i)} - \gamma^{(i)}\Vert w\Vert + b = 0$$ This leads to $$\gamma^{(i)} = \frac{w^Tx^{(i)} + b}{\Vert w\Vert}$$
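The corrected derivation can be sanity-checked numerically. Here is a minimal sketch with arbitrary example values for $w$, $b$, and $x^{(i)}$ (all chosen for illustration, not from the lecture notes): it computes $\gamma^{(i)}$ from the corrected formula and confirms that the resulting point $B = x^{(i)} - \gamma^{(i)}\,w/\Vert w\Vert$ lies on the decision boundary.

```python
import numpy as np

# Arbitrary example values (assumptions, not from the notes)
w = np.array([3.0, 4.0])      # normal vector, so ||w|| = 5
b = -2.0
x_i = np.array([2.0, 1.0])    # an example training point

norm_w = np.linalg.norm(w)    # sqrt(w^T w), NOT w^T w itself

# Geometric margin from the corrected formula
gamma = (w @ x_i + b) / norm_w

# Point B: step from x_i against the unit normal by gamma
B = x_i - gamma * w / norm_w

# B must satisfy the boundary equation w^T x + b = 0
print(gamma)        # 1.6 for these values
print(w @ B + b)    # ~0, confirming B is on the boundary
```

Note that if you instead used the asker's (incorrect) $\gamma^{(i)} = w^Tx^{(i)}+b$, the resulting point would miss the boundary by a factor of $\Vert w\Vert$.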