Margin width and normalization in SVM


Let $x_1$ and $x_2$ be two support vectors and $w$ an orthogonal vector to the decision hyperplane.

[Figure: SVM decision hyperplane with support vectors $x_1$, $x_2$ and normal vector $w$]

To find the width of the margin, I don't understand why we have to calculate the dot product between $(x_2-x_1)$ and the normalized vector $\frac{w}{||w||}$, and not the dot product between $(x_2-x_1)$ and $w$. To me, this is the same as projecting $(x_2-x_1)$ onto $\text{span}(w)$, so I don't understand the importance of normalization. Why would the magnitude of $w$ have any impact on the projection measure?
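A quick numeric check illustrates the distinction. The vectors below are made-up values (assuming NumPy): the dot product with the raw $w$ scales with $\|w\|$, while the dot product with the unit vector $\frac{w}{\|w\|}$ gives the actual geometric distance between the two margin hyperplanes and is unchanged if $w$ is rescaled.

```python
import numpy as np

# Hypothetical support vectors and a normal vector (illustrative values only)
x1 = np.array([1.0, 1.0])
x2 = np.array([3.0, 3.0])
w = np.array([2.0, 2.0])  # orthogonal to the decision hyperplane

d = x2 - x1

# Dot product with the raw w: depends on the magnitude of w
raw = np.dot(d, w)

# Dot product with the unit vector: the geometric margin width
width = np.dot(d, w / np.linalg.norm(w))

# Rescaling w changes the raw dot product but not the normalized one
w_scaled = 10 * w
raw_scaled = np.dot(d, w_scaled)                                   # 10x larger
width_scaled = np.dot(d, w_scaled / np.linalg.norm(w_scaled))      # unchanged

assert np.isclose(raw_scaled, 10 * raw)
assert np.isclose(width_scaled, width)
```

The point is that $d \cdot w = \|d\|\,\|w\|\cos\theta$ still carries the factor $\|w\|$, so it is the projection length *times* $\|w\|$; dividing by $\|w\|$ strips that factor and leaves only the geometric length of the projection.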