Support Vector Machine - Existence of Canonical Hyperplane


I'm studying linear Support Vector Machines and I have a question about the existence of a canonical hyperplane.

Here I consider a linearly separable training set $X = \{(x_i, y_i)\}_{i=1}^{n}$, with $x_i \in \mathbb{R}^d$ and $y_i \in \{-1, 1\}$. That is, writing $A^+ = \{x_i : y_i = 1\}$ and $A^- = \{x_i : y_i = -1\}$, there is a hyperplane $H: f(x) = w \cdot x + b = 0$ with $f(x_i) > 0$ for all $x_i \in A^+$ and $f(x_i) < 0$ for all $x_i \in A^-$.

So we can define the classifier $g(x) = 1$ if $f(x) > 0$ and $g(x) = -1$ if $f(x) < 0$.

Now we consider a canonical hyperplane, i.e., one that satisfies the following: there are $x^+ \in A^+$ and $x^- \in A^-$ such that $w \cdot x^+ + b = 1$ and $w \cdot x^- + b = -1$.

Question: How can I ensure the existence of such a hyperplane?

My incomplete attempt to explain this: take $x^+ \in A^+$ attaining $w \cdot x^+ + b = \min_{x \in A^+}(w \cdot x + b) = k > 0$ $(1)$, and $x^- \in A^-$ attaining $w \cdot x^- + b = \max_{x \in A^-}(w \cdot x + b) = -r < 0$ $(2)$. Dividing $(1)$ by $k$ and $(2)$ by $r$, respectively, we get:

$\dfrac{w}{k} \cdot x^+ + \dfrac{b}{k} = 1$ and $\dfrac{w}{r} \cdot x^- + \dfrac{b}{r} = -1$, and here I get stuck: unless $k = r$, the two divisions rescale $w$ by different factors, so the two equations no longer refer to a single pair $(w, b)$.
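To see the obstruction concretely, here is a small numerical sketch. The data points and the separating hyperplane $(w, b)$ are hypothetical choices, not from the problem; the sketch just computes $k$ and $r$ as defined in $(1)$ and $(2)$ and shows that in general $k \neq r$:

```python
import numpy as np

# Hypothetical linearly separable toy data in R^2.
A_plus = np.array([[2.0, 3.0], [3.0, 4.0]])      # points with y_i = +1
A_minus = np.array([[-1.0, -1.0], [0.0, -2.0]])  # points with y_i = -1

# An arbitrary separating hyperplane f(x) = w.x + b (assumed, not unique).
w = np.array([1.0, 1.0])
b = -1.0

# k = min over A+ of (w.x + b), as in (1); positive by separability.
k = np.min(A_plus @ w + b)
# max over A- of (w.x + b) equals -r, as in (2), so r is positive.
r = -np.max(A_minus @ w + b)

print(k, r)  # → 4.0 3.0
# Since k != r here, dividing (1) by k and (2) by r scales w by two
# different factors, so the resulting equations do not describe one
# common pair (w, b) -- which is exactly where the attempt gets stuck.
```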

Thank you!